US20140258971A1 - Dynamic editors for functionally composed ui - Google Patents

Dynamic editors for functionally composed ui

Info

Publication number
US20140258971A1
US20140258971A1 (application US14/285,039)
Authority
US
United States
Prior art keywords
elements
model data
real
editor
declarative model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/285,039
Inventor
Siddharth Jayadevan
Christopher Donald Hackmann
Stephen M. Danton
Simon David Kent
Michael Charles Murray
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US14/285,039 priority Critical patent/US20140258971A1/en
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JAYADEVAN, SIDDHARTH, DANTON, STEPHEN M., HACKMANN, CHRISTOPHER DONALD, KENT, SIMON DAVID, MURRAY, MICHAEL CHARLES
Publication of US20140258971A1 publication Critical patent/US20140258971A1/en
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Abandoned legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/34: Graphical or visual programming
    • G06F 8/38: Creation or generation of source code for implementing user interfaces

Definitions

  • Computers have become highly integrated in the workforce, in the home, in mobile devices, and many other places. Computers can process massive amounts of information quickly and efficiently.
  • Software applications designed to run on computer systems allow users to perform a wide variety of functions including business applications, schoolwork, entertainment and more. Software applications are often designed to perform specific tasks, such as word processor applications for drafting documents, or email programs for sending, receiving and organizing email.
  • the present invention extends to methods, systems, and computer program products for providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data.
  • Embodiments include presenting a first user interface (UI) of an executing application program based on a UI model that includes declarative model data that drives the behavior of the first UI.
  • the UI model includes particular declarative model data that declaratively defines a visual appearance of one or more UI elements of the first UI. Each of the UI element(s) is an instance of the particular declarative model data.
  • a second UI of a real-time incremental editor is presented.
  • the second UI is separate from the application program and is configured to receive one or more editing inputs that are to be applied to the particular declarative model data of the UI model, to edit the first UI while the application program is executing.
  • a first editing user input is received at the second UI.
  • the first editing input selects the particular declarative model data that declaratively defines the UI element(s). Based on selection of the particular declarative model data within the real-time incremental editor, each of the UI element(s) is visually distinguished within the first UI as being selected at the second UI of the real-time incremental editor.
  • a second editing user input is received at the second UI of the real-time incremental editor.
  • the second editing user input provides one or more changes to the particular declarative model data corresponding to the UI element(s), in order to enact one or more desired edits to each of the UI element(s).
  • the first UI is updated to reflect the desired edits to the UI element(s), including altering the visual appearance of each of the UI element(s) based on the changes to the particular declarative model data corresponding to the UI element(s).
  • FIG. 1 illustrates a computer architecture in which embodiments of the present invention may operate including providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data.
  • FIG. 2 illustrates a flowchart of an example method for providing a real-time incremental editor for enacting changes on a user interface (UI).
  • FIG. 3 illustrates a flowchart of an example method for presenting a preview of the potential results of a user edit to data model data.
  • FIG. 4 illustrates an embodiment of the present invention in which a UI is dynamically updated.
  • FIG. 5 illustrates an embodiment of the present invention in which selected elements are highlighted in a running application.
  • the present invention extends to methods, systems, and computer program products for providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data.
  • a computer system instantiates a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, where the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model.
  • the computer system receives an editing input from a user indicating one or more desired edits that are to be made to the UI and determines, based on the received editing inputs, which changes are to be made to the data model data to enact the desired changes in the UI.
  • the computer system updates the UI in real-time based on the determined edits to the data model data and presents the updated UI in real-time without losing state information in the resultant UI.
  • a computer system instantiates a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, where the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model.
  • the computer system receives an input from a user at the instantiated editor, where the input indicates various edits that are to be applied to a running software application.
  • the computer system determines, based on the received user input, which changes are to be made to the data model data to enact the desired changes in the running software application and previews the determined changes in real-time on the running application.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
  • Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions, computer program products and/or data structures.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • Computer-readable media that store computer-executable instructions are physical storage media.
  • Computer-readable media that carry computer-executable instructions are transmission media.
  • embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa).
  • computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system.
  • computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
  • the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • FIG. 1 illustrates a computer architecture 100 in which the principles of the present invention may be employed.
  • Computer architecture 100 includes editors A and B ( 110 A and 110 B, respectively). In some embodiments, only one editor may be used. In other embodiments, two or more editors may be used.
  • Editors, as used herein, may refer to development environments in which a software developer may design or otherwise construct a software application or element thereof. The editors may comprise all different types of editors including text editors, graphical editors (e.g. tree/diagram editors) and other types of editors. The editors may provide a textual, graphical or other representation of a model. This model may be used as the basis for a software application (e.g. running application 135 ) or for the software application's user interface ( 111 ). Thus, in some cases, a developer may use an editor (e.g. 110 A) to design or edit a model that forms the basis of user interface 111 .
  • UI 111 may be a data-driven interface.
  • the UIs may have a UI model which is based on a data model (e.g. model 120 ).
  • Such data-driven interfaces may be changed in real-time (or essentially real-time) by changing the data ( 121 ) of the data model ( 120 ).
  • update determining module 115 may be used to determine, based on the received determined changes 113 from the editor, which changes are to be made to the data to bring about the desired changes in the UI.
  • Update determining module 115 may also be configured to propagate any changes or edits made to editor A ( 110 A) by user 105 to editor B ( 110 B), and vice versa. Thus, edits from editor B 116 may be propagated to editor A and edits from editor A 117 may be propagated to editor B. In cases where other editors exist, edits may also be propagated to the other editors. The propagation of edits may occur because of a two-way binding between the update determining module 115 (a data flow engine in some embodiments) and the editors.
  • update determining module 115 may output the UI model updates 123 that are to be made to the UI ( 111 ) of running application 135 .
  • The runtime model interpreter ( 130 ) may receive the UI model updates and, in real-time, output the actual UI changes or updates 131 that are to be made to the UI.
  • UI 111 is updated dynamically, in real-time based on the editing inputs provided by the user 105 .
  • the changes may be viewed within the editor as the UI is altered based on the runtime model interpreter's outputs.
  • FIG. 2 illustrates a flowchart of a method 200 for providing a real-time incremental editor for enacting changes on a user interface (UI).
  • the method 200 will now be described with frequent reference to the components and data of environment 100 of FIG. 1 and environment 400 of FIG. 4 .
  • Method 200 includes an act of instantiating a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, wherein the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model (act 210 ).
  • a computer system may instantiate real-time incremental editor 110 A which is configured to receive editing inputs 106 that are to be applied to data 121 of data model 120 .
  • Data model data 121 declaratively defines a UI model that drives the behavior of UI 111 which is based on the UI model.
  • declaratively defining a model refers to allowing the user to design or specify the model using data, and not rigid logic.
  • That data can then be analyzed and interpreted in real-time without needing to be compiled first. Accordingly, by having data model data 121 that declaratively defines a model, simple changes to the data model data can be applied in real-time without having to recompile or even stop a running software application that is based on the data model.
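  • The declarative approach described above can be illustrated with a short sketch: because the UI model is plain data, it can be interpreted and re-interpreted at runtime without a compile step. The model shape, field names and render function below are illustrative assumptions, not the patent's actual format.

```python
# A minimal, hypothetical declaratively defined UI model: the model is
# plain data (nested dicts), so it can be inspected and re-interpreted
# at runtime without compilation. All names here are illustrative.

ui_model = {
    "type": "panel",
    "children": [
        {"type": "input", "id": "name"},
        {"type": "button", "id": "ok", "label": "OK"},
    ],
}

def render(node, depth=0):
    """Interpret the model data into a textual UI representation."""
    line = "  " * depth + node["type"]
    if "label" in node:
        line += " [" + node["label"] + "]"
    lines = [line]
    for child in node.get("children", []):
        lines.extend(render(child, depth + 1))
    return lines

# A "simple change" to the model data is just a data edit; the next
# interpretation reflects it immediately, with no recompile or restart.
ui_model["children"][1]["label"] = "Submit"
print("\n".join(render(ui_model)))
```

Because the model is interpreted rather than compiled, the edit to the button's label takes effect on the very next interpretation pass.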
  • data model 120 may be structured according to a schema. This may be referred to herein as a schematized model.
  • Data model 120 may include many different types of data. This data may be stored in a local data store or may be stored remotely, perhaps even on a number of different computer systems such as on a storage area network (SAN).
  • data model data 121 may be divided into chunks. These chunks may be based on corresponding portions of a UI tree. For instance, a UI tree that has multiple different elements as part of the tree may have a chunk of data corresponding to each element of the tree. In some cases, these chunks may comprise text chunks.
  • the text chunks may be structured such that data model (schema) grammar that defines how the data 121 is to be structured is not violated. Accordingly, in such cases, bad or improperly functioning states may be avoided as the text chunks are structured in such a way as to prevent schema grammar violations.
  • a two-way data binding may be applied between a textual editor view and the text chunks of the data model data, such that any edits to the data model data cause the textual editor to update automatically.
  • editor A may have a two-way data binding between itself and the text chunks of the data model data 121 .
  • any edits made to data model data 121 (e.g. as a result of editing inputs 106 ) cause the textual editor (editor A) to update automatically.
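  • The two-way binding described above can be sketched as follows: an edit on either side (the textual editor view or the model's text chunk) automatically updates the other. The Observable class and bind_two_way helper are illustrative assumptions, not the patent's API.

```python
# A hedged sketch of two-way data binding between a textual editor view
# and a text chunk of the data model data. All names are illustrative.

class Observable:
    def __init__(self, value=""):
        self._value = value
        self._listeners = []

    def subscribe(self, fn):
        self._listeners.append(fn)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        if new != self._value:        # guard prevents update cycles
            self._value = new
            for fn in self._listeners:
                fn(new)

def bind_two_way(a, b):
    a.subscribe(lambda v: setattr(b, "value", v))
    b.subscribe(lambda v: setattr(a, "value", v))

model_chunk = Observable('<button label="OK"/>')   # a text chunk of model data
editor_view = Observable(model_chunk.value)        # the textual editor view
bind_two_way(model_chunk, editor_view)

editor_view.value = '<button label="Submit"/>'     # user edits the text
print(model_chunk.value)                           # the model chunk follows
```

The equality guard in the setter is what keeps the two bound observables from notifying each other in an endless loop.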
  • Method 200 includes an act of receiving an editing input from a user indicating one or more desired edits that are to be made to the UI (act 220 ).
  • These edits may comprise text edits, graphical edits or any other type of changes or modifications that can be made to UI 111 .
  • Editor A may be configured to output determined changes 113 indicating the editor's interpretation of the user's editing inputs. For instance, if user 105 indicates in editing input 106 that a diagram element is to be changed, the editor may identify in changes 113 which element is to be changed.
  • the editor would identify the current position of the element and the new (moved) position of the element.
  • This is one example among many, many different changes that can be made. Accordingly, the example is intended merely to illustrate how changes 113 may be generated and should not be read as limiting the type or number or style of changes that can be made.
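  • One hypothetical shape for the editor's determined changes ( 113 ) is a small record naming the element, the property being edited, and the current and new values, mirroring the move example above. The record layout below is an assumption for illustration.

```python
# A sketch of a "determined change" record that an editor might emit as
# its interpretation of a user's editing input. The field names and the
# element identifier are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class DeterminedChange:
    element_id: str
    property_name: str
    old_value: object
    new_value: object

# The user drags a diagram element to a new position; the editor records
# both the current position and the new (moved) position.
change = DeterminedChange("obj414", "position", (40, 120), (40, 10))
print(change)
```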
  • Method 200 also includes an act of determining, based on the received editing inputs, which changes are to be made to the data model data to enact the desired changes in the UI (act 230 ).
  • update determining module 115 may determine, based on received editing inputs 106 (and/or determined changes 113 ), which UI model updates 123 are to be made to data model data 121 (upon which UI 111 is built) to enact the desired changes in the UI.
  • These UI model updates may comprise, for example, an indication of which elements are to be changed and by how much.
  • the UI model updates 123 may, at least in some cases, be transformed into UI changes using purely functional transformations. Because the UI is data-driven, data model data 121 may be changed in a purely functional transformation, without recompiling the UI or even stopping it to make the changes.
  • the UI model updates 123 indicating how the data model data 121 is to be transformed into UI changes 131 are data and can be changed, manipulated or transferred in the same way data is normally handled.
  • the data indicating how the data model data 121 is to be transformed into UI changes 131 may be mapped to a structured UI representation including text, a UI tree, a graphical diagram or any other type of structured UI representation.
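  • A purely functional transformation of the kind described above can be sketched briefly: the structured UI representation is derived from the model data without mutating it, so an update produces a new model value and a freshly derived UI. The functions and model shape are illustrative assumptions.

```python
# A sketch of purely functional model-to-UI transformation: no function
# mutates its input, so the UI can be re-derived from updated model data
# without stopping or recompiling anything. Names are illustrative.

def apply_update(model, update):
    """Return a NEW model with the update applied; the input is untouched."""
    merged = dict(model)
    merged.update(update)
    return merged

def to_ui(model):
    """Pure function: model data in, structured UI representation out."""
    return {"widget": model["type"], "text": model.get("label", "")}

model_v1 = {"type": "button", "label": "OK"}
model_v2 = apply_update(model_v1, {"label": "Submit"})

print(model_v1["label"])   # still "OK": the original model is unchanged
print(to_ui(model_v2))
```

Because both versions of the model remain valid data values, the updates themselves can be stored, manipulated or transferred like any other data, as the text notes.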
  • update determining module 115 may comprise a data flow engine.
  • a data flow engine may refer to a software engine that supports functional composition and the use of queries for property-binding, repeated object construction, and conditional object construction.
  • the data flow engine may be configured to receive and process functional application descriptions based on models. For instance, the data flow engine may be configured to process extensible markup language (XML) application foundation (XAF) applications or other functional application descriptions. Additionally or alternatively, the data flow engine may be configured to process queries constructed by the editors.
  • XAF is a software application framework for generating a runtime structure for a given software application and for managing execution of the software application.
  • Applications may be generated by an application framework that includes a collection or graph of connected application components.
  • Functionalities of an application configured according to the application framework are enabled by dynamically configuring groups of application components into component domains where each domain is configured to enable a given functionality of the application, for example, displaying a picture in a word processing document.
  • an application may pass an application description for each piece of application functionality to an application description engine.
  • the application description provides declarative rules for structuring and composing the component domains, and the application description engine is operative to interpret the declarative rules for creating and reconfiguring the component domains as required based on data events received by the application.
  • Data events may be any type of software message, user input, or other indication that something has occurred that may affect the software application. Data events may occur, for example, when a user clicks a mouse to interact with the application, or when an error message is presented during processing of the application.
  • XAF may be configured to track dependencies such that data evaluations and object graph constructions may be incrementally updated when data changes.
  • XAF may track dependencies in data such that a software application may be incrementally updated when the data is updated. In some embodiments, this incremental updating may be carried out automatically by a XAF engine (e.g. the data flow engine or update determining module 115 ).
  • data model data 121 may comprise a schematized XAF application description.
  • This XAF application description may, in turn, be interpreted by the XAF engine (or update determining module 115 ).
  • at least one of the real-time incremental editors 110 A/ 110 B may be instantiated based on a XAF application description.
  • XAF engines and XAF application descriptions are provided as examples, any type of functional application description language or functional application description language engine may be used.
  • Method 200 includes an act of updating the UI in real-time based on the determined edits to the data model data (act 240 ).
  • runtime model interpreter 130 may receive UI model updates, interpret those model updates in real-time (or substantially real-time) and send the UI updates 131 back to the editor(s). These changes may be applied to the UI(s) dynamically in real-time, without stopping or recompiling the running UI. In some cases, only the minimal set of controls and properties (or other elements) are updated based on determined data dependencies.
  • If update determining module 115 determines that data changes to element A, for example, affect elements C, D and F based on determined data dependencies, then only elements A, C, D and F will be updated while the other elements will be left alone.
  • as noted above, update determining module 115 may comprise a data flow engine.
  • the data flow engine may be configured to support transformations of collections, records and atoms (and any other forms of data model data) as well as track dependencies across data model data items. By tracking such data dependencies, the minimal set of elements to be changed may be discovered and other elements may be left unchanged.
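  • The dependency tracking described above can be sketched as follows: given a map of which elements depend on which, a change to one element yields the minimal set of elements to update, and everything else is left alone. The dependency map below mirrors the A/C/D/F example in the text; the function is an illustrative assumption.

```python
# A hedged sketch of dependency tracking for minimal incremental updates:
# walk the dependency map transitively from the changed element to find
# the minimal set of elements that must be re-rendered.

def affected(changed, deps):
    """deps maps each element to its dependents; walk them transitively."""
    result, stack = set(), [changed]
    while stack:
        node = stack.pop()
        if node not in result:
            result.add(node)
            stack.extend(deps.get(node, []))
    return result

deps = {"A": ["C", "D"], "D": ["F"]}   # C and D depend on A; F depends on D
print(sorted(affected("A", deps)))     # A, C, D and F update; B and E do not
```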
  • a UI model interpreter may be instantiated to perform each of the following steps in real-time: receive a declaratively defined UI model, interpret the UI model via functional transformation using the data flow engine, and present the resulting UI to the user.
  • method 200 includes an act of presenting the updated UI in essentially real-time without losing state information in the resultant UI (act 250 ).
  • running application 135 may present the visualized results of the updated UI 111 in essentially real-time without losing state information in the UI.
  • where the terms real-time, essentially real-time or substantially real-time appear, these terms refer to a short time frame and not to an instantaneous event. Accordingly, even in real-time, a delay may occur between the time an input is received and when that input is applied.
  • Environment 400 includes an editor and a corresponding UI shown at two different times (i.e. editor at time 1 (T 1 ) 410 with corresponding UI at T 1 411 , and editor at time 2 (T 2 ) 420 with corresponding UI at T 2 421 ).
  • Editor at T 1 includes a UI 411 that has three items: input box 412 , button 413 and selectable/draggable object 414 .
  • a user may indicate that selectable/draggable object 414 is to be moved to a position above input box 412 , that button 413 is to be deleted and that a menu box is to be added.
  • a user's edits may be analyzed to determine which changes are to be made to the user interface (or other software application) and applied to the user interface in real-time, while the UI is running, without stopping the application and without losing state information.
  • the potential results of a user's edits may be previewed within an editor, as explained below with regard to method 300 of FIG. 3 .
  • FIG. 3 illustrates a flowchart of a method 300 for presenting a preview of the potential results of a user edit to data model data.
  • the method 300 will now be described with frequent reference to the components and data of environment 100 of FIG. 1 and environment 500 of FIG. 5 .
  • Method 300 includes an act of instantiating a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, wherein the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model (act 310 ).
  • a computer system may instantiate incremental real-time editor A ( 110 A) which may be configured to receive editing inputs 106 that are to be applied to data 121 of data model 120 .
  • data model data 121 declaratively defines a UI model that drives the behavior of UI 111 which is based on the UI model. Because the UI model is declaratively defined, simple changes to the model data can be applied in real-time without having to recompile or even stop a running software application that is based on the data model.
  • Method 300 further includes an act of receiving an input from a user at the instantiated editor, the input indicating one or more edits that are to be applied to a running software application (act 320 ).
  • These edits may comprise text edits, graphical edits or any other type of changes or modifications that can be made to UI 111 .
  • Method 300 also includes an act of determining, based on the received user input, which changes are to be made to the data model data to enact the desired changes in the running software application (act 330 ).
  • update determining module 115 may determine, based on received editing inputs 106 (and/or determined changes 113 ), which UI model updates 123 are to be made to data model data 121 (upon which UI 111 is built) to enact the desired changes in the UI.
  • These UI model updates may comprise, for example, an indication of which elements are to be changed and by how much.
  • Method 300 includes an act of previewing the determined changes in real-time on the running application (act 340 ).
  • running application 135 may be configured to preview the determined UI model updates 123 which are based on determined changes 113 which are in turn based on editing input 106 .
  • These UI model updates may be applied to a running UI by runtime model interpreter 130 which interprets the UI model (and changes thereto) at runtime and provides the actual UI updates which are displayed in the UI ( 111 ) of the running application ( 135 ).
  • runtime model interpreter 130 interprets the UI model (and changes thereto) at runtime and provides the actual UI updates which are displayed in the UI ( 111 ) of the running application ( 135 ).
  • the results of those inputs may be displayed in the running application.
  • user 105 may select an element of the data model (i.e. a UI element) in the editor and the editor may display, in real-time, a highlight around each corresponding instance of the selected element on the running application.
  • editor 510 may present UI 515 to the user with various elements including a menu, input box and other items A, B, C and D.
  • User 105 may select item B (i.e. selected item 516 ) and, upon selection, any corresponding items in the running application that are implemented based on item B may be highlighted.
  • items B 1 , B 2 and B 3 , each of which implements item B of UI 515 , are highlighted.
  • the user can easily look at the running application and identify each of the items or areas where the selected item is being used. This can be helpful in debugging the running application.
  • the other UI items of the running application (i.e. menu 521 , item A 522 A, item C 522 C, item D 522 D and input box 523 ) are not highlighted unless or until the item is selected in the editor.
  • the selected item may be identified within the running application using a query.
  • Editor 510 may be configured to construct a query that is used to query for each instance of the selected element. The editor can then identify the items based on the response to the query.
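  • The instance query described above can be sketched as a walk over the running application's UI tree that collects every element implemented from the selected model item, so each instance can be highlighted. The tree shape and the "based_on" field are illustrative assumptions, not the patent's data format.

```python
# A hedged sketch of querying a running UI tree for every instance of a
# selected model element, mirroring the FIG. 5 scenario where selecting
# item B highlights B1, B2 and B3. All names are illustrative.

def find_instances(node, selected_id):
    """Collect ids of all tree nodes implemented from the selected element."""
    hits = []
    if node.get("based_on") == selected_id:
        hits.append(node["id"])
    for child in node.get("children", []):
        hits.extend(find_instances(child, selected_id))
    return hits

running_ui = {
    "id": "root", "children": [
        {"id": "menu"},
        {"id": "B1", "based_on": "B"},
        {"id": "panel", "children": [
            {"id": "B2", "based_on": "B"},
            {"id": "B3", "based_on": "B"},
        ]},
    ],
}

print(find_instances(running_ui, "B"))  # every instance to be highlighted
```

Elements without a match (here "menu" and "panel") are simply skipped, which corresponds to the unhighlighted items in the running application.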
  • a host environment may be configured to display the highlighting around each corresponding instance of the selected element on the running application.
  • user inputs may be received at the editor that result in a new feature being added to the running software application. In such cases, the new features may be dynamically added to the running software application in real-time.
  • accordingly, an environment is provided in which a user may select a UI (or other) item, and each corresponding instance of that selected item will be highlighted in the running application, without stopping and/or restarting the application.
  • a development environment may be provided where a user's editing inputs are analyzed to determine which changes are to be made to the user interface (or other software application) and are applied to the user interface in real-time, while the UI is running, without stopping the application and without losing state information.

Abstract

Embodiments include presenting a first user interface (UI) of an executing application, based on model data that defines an appearance of UI elements of the first UI. Concurrent to presenting the first UI, a second UI of a real-time editor is presented. The second UI is configured to receive editing inputs that are to be applied to the model data, to edit the first UI while the application is executing. A first editing input is received at the second UI, which selects particular model data defining particular UI element(s). Based on the first editing input at the second UI, each of the particular UI element(s) is visually distinguished within the first UI. A second editing input is received at the second UI, which provides changes to the model data corresponding to the particular UI element(s). The particular UI element(s) of the first UI are updated to reflect the desired changes.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 12/637,353, filed Dec. 14, 2009, and entitled “DYNAMIC EDITORS FOR FUNCTIONALLY COMPOSED UI,” which application claims priority to U.S. Provisional Patent Application Ser. No. 61/261,701, filed Nov. 16, 2009 and entitled “DYNAMIC EDITORS FOR FUNCTIONALLY COMPOSED UI.” The entire contents of the foregoing applications are incorporated by reference herein in their entirety.
  • BACKGROUND
  • Computers have become highly integrated in the workforce, in the home, in mobile devices, and many other places. Computers can process massive amounts of information quickly and efficiently. Software applications designed to run on computer systems allow users to perform a wide variety of functions including business applications, schoolwork, entertainment and more. Software applications are often designed to perform specific tasks, such as word processor applications for drafting documents, or email programs for sending, receiving and organizing email.
  • In many cases, software applications are designed to interact with or even design other software applications. For instance, integrated development environments can be used by developers to write and debug software applications. Some of these environments are specialized for designing software application user interfaces. These specialized user interface designers are often configured to provide a what-you-see-is-what-you-get (WYSIWYG) design experience. Using such a designer, a developer may simply drag and drop user interface (UI) elements onto the UI that is being designed, and the specialized UI designer application will write the software code corresponding to the user's drag and drop actions.
  • BRIEF SUMMARY
  • The present invention extends to methods, systems, and computer program products for providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data.
  • Embodiments include presenting a first user interface (UI) of an executing application program based on a UI model that includes declarative model data that drives the behavior of the first UI. The UI model includes particular declarative model data that declaratively defines a visual appearance of one or more UI elements of the first UI. Each of the UI element(s) is an instance of the particular declarative model data.
  • Concurrent to presenting the first UI, a second UI of a real-time incremental editor is presented. The second UI is separate from the application program and is configured to receive one or more editing inputs that are to be applied to the particular declarative model data of the UI model, to edit the first UI while the application program is executing.
  • A first editing user input is received at the second UI. The first editing input selects the particular declarative model data that declaratively defines the UI element(s). Based on selection of the particular declarative model data within the real-time incremental editor, each of the UI element(s) is visually distinguished within the first UI as being selected at the second UI of the real-time incremental editor.
  • A second editing user input is received at the second UI of the real-time incremental editor. The second editing user input provides one or more changes to the particular declarative model data corresponding to the UI element(s), in order to enact one or more desired edits to each of the UI element(s). The first UI is updated to reflect the desired edits to the UI element(s), including altering the visual appearance of each of the UI element(s) based on the changes to the particular declarative model data corresponding to the UI element(s).
  • This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention. The features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the present invention will become more fully apparent from the following description and appended claims, or may be learned by the practice of the invention as set forth hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 illustrates a computer architecture in which embodiments of the present invention may operate including providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data.
  • FIG. 2 illustrates a flowchart of an example method for providing a real-time incremental editor for enacting changes on a user interface (UI).
  • FIG. 3 illustrates a flowchart of an example method for presenting a preview of the potential results of a user edit to data model data.
  • FIG. 4 illustrates an embodiment of the present invention in which a UI is dynamically updated.
  • FIG. 5 illustrates an embodiment of the present invention in which selected elements are highlighted in a running application.
  • DETAILED DESCRIPTION
  • The present invention extends to methods, systems, and computer program products for providing a real-time incremental editor for enacting changes on a user interface (UI) and presenting a preview of the potential results of a user edit to data model data. A computer system instantiates a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, where the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model. The computer system receives an editing input from a user indicating one or more desired edits that are to be made to the UI and determines, based on the received editing inputs, which changes are to be made to the data model data to enact the desired changes in the UI. The computer system updates the UI in real-time based on the determined edits to the data model data and presents the updated UI in real-time without losing state information in the resultant UI.
  • In another embodiment, a computer system instantiates a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, where the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model. The computer system receives an input from a user at the instantiated editor, where the input indicates various edits that are to be applied to a running software application. The computer system determines, based on the received user input, which changes are to be made to the data model data to enact the desired changes in the running software application and previews the determined changes in real-time on the running application.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions, computer program products and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
  • FIG. 1 illustrates a computer architecture 100 in which the principles of the present invention may be employed. Computer architecture 100 includes editors A and B (110A and 110B, respectively). In some embodiments, only one editor may be used. In other embodiments, two or more editors may be used. Editors, as used herein, may refer to development environments in which a software developer may design or otherwise construct a software application or element thereof. The editors may comprise many different types of editors, including text editors, graphical editors (e.g. tree/diagram editors) and other types of editors. The editors may provide a textual, graphical or other representation of a model. This model may be used as the basis for a software application (e.g. running application 135) or for the software application's user interface (111). Thus, in some cases, a developer may use an editor (e.g. 110A) to design or edit a model that forms the basis of user interface 111.
  • During UI development, the developer (or more generically, user 105) may desire to make changes to the UI of running application 135. Thus, user 105 provides editing inputs 106 which, when applied, will alter the UI. These editing inputs are received by the editor(s) and are processed to determine which changes are to be made to the UI to bring about the desired result. Here it should be noted that UI 111 may be a data-driven interface. Accordingly, the UIs may have a UI model which is based on a data model (e.g. model 120). Such data-driven interfaces may be changed in real-time (or essentially real-time) by changing the data (121) of the data model (120). Thus, update determining module 115 may be used to determine, based on the received determined changes 113 from the editor, which changes are to be made to the data to bring about the desired changes in the UI.
  • Update determining module 115 may also be configured to propagate any changes or edits made to editor A (110A) by user 105 to editor B (110B), and vice versa. Thus, edits from editor B 116 may be propagated to editor A and edits from editor A 117 may be propagated to editor B. In cases where other editors exist, edits may also be propagated to the other editors. The propagation of edits may occur because of a two-way binding between the update determining module 115 (a data flow engine in some embodiments) and the editors.
  • Upon determining which data changes are to be made to data model 120, update determining module 115 may output the UI model updates 123 that are to be made to the UI (111) of running application 135. The runtime model interpreter 130 may receive the UI model updates and, in real-time, output the actual UI changes or updates 131 that are to be made to the UI. Thus, in this manner, UI 111 is updated dynamically, in real-time, based on the editing inputs provided by the user 105. The changes may be viewed within the editor as the UI is altered based on the runtime model interpreter's outputs. These steps will be explained in greater detail below with regard to methods 200 of FIGS. 2 and 300 of FIG. 3.
  • In view of the systems and architectures described above, methodologies that may be implemented in accordance with the disclosed subject matter will be better appreciated with reference to the flow charts of FIGS. 2 and 3. For purposes of simplicity of explanation, the methodologies are shown and described as a series of blocks. However, it should be understood and appreciated that the claimed subject matter is not limited by the order of the blocks, as some blocks may occur in different orders and/or concurrently with other blocks from what is depicted and described herein. Moreover, not all illustrated blocks may be required to implement the methodologies described hereinafter.
  • FIG. 2 illustrates a flowchart of a method 200 for providing a real-time incremental editor for enacting changes on a user interface (UI). The method 200 will now be described with frequent reference to the components and data of environments 100 of FIGS. 1 and 400 of FIG. 4.
  • Method 200 includes an act of instantiating a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, wherein the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model (act 210). For example, a computer system may instantiate real-time incremental editor 110A which is configured to receive editing inputs 106 that are to be applied to data 121 of data model 120. Data model data 121 declaratively defines a UI model that drives the behavior of UI 111 which is based on the UI model. As used herein, declaratively defining a model refers to allowing the user to design or specify the model using data, and not rigid logic. That data can then be analyzed and interpreted in real-time without needing to be compiled first. Accordingly, by having data model data 121 that declaratively defines a model, simple changes to the data model data can be applied in real-time without having to recompile or even stop a running software application that is based on the data model.
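By way of illustration only, the following sketch (with hypothetical names; not code from the disclosed system) shows what it can mean for a UI model to be declaratively defined as data: the model is a plain data structure that an interpreter re-reads at any time, so editing the data re-renders the UI with no compile step.

```python
# Hypothetical sketch: a UI declared as plain data. Because the model is
# data rather than compiled logic, an interpreter can re-read it at any
# time and apply changes without a rebuild.

ui_model = {
    "type": "panel",
    "children": [
        {"id": "input1", "type": "input_box", "label": "Name"},
        {"id": "btn1", "type": "button", "label": "Submit"},
    ],
}

def render(node, depth=0):
    """Interpret the declarative model into a textual UI representation."""
    pad = "  " * depth
    label = node.get("label", "")
    lines = [f"{pad}<{node['type']}> {label}".rstrip()]
    for child in node.get("children", []):
        lines.extend(render(child, depth + 1))
    return lines

# Editing the model data takes effect on the next interpretation pass --
# no compile step and no restart of the interpreter.
ui_model["children"][1]["label"] = "Save"
print("\n".join(render(ui_model)))
```

The element names and rendering scheme here are assumptions for illustration; the disclosure does not prescribe any particular model syntax.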
  • In some cases, data model 120 may be structured according to a schema. This may be referred to herein as a schematized model. Data model 120 may include many different types of data. This data may be stored in a local data store or may be stored remotely, perhaps even on a number of different computer systems such as on a storage area network (SAN). In some embodiments, data model data 121 may be divided into chunks. These chunks may be based on corresponding portions of a UI tree. For instance, a UI tree that has multiple different elements as part of the tree, may have a chunk of data corresponding to each element of the tree. In some cases, these chunks may comprise text chunks. The text chunks may be structured such that data model (schema) grammar that defines how the data 121 is to be structured is not violated. Accordingly, in such cases, bad or improperly functioning states may be avoided as the text chunks are structured in such a way as to prevent schema grammar violations.
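The chunking described above might be sketched as follows (hypothetical names and a toy schema; the disclosed system's actual schema grammar is not specified here): one text chunk per element of the UI tree, with each chunk validated against the schema so that no chunk can introduce a grammar violation.

```python
import json

# Hypothetical sketch: divide declarative model data into text chunks,
# one per element of the UI tree, so each chunk can be edited in
# isolation without violating the schema grammar.

SCHEMA_TYPES = {"panel", "button", "input_box"}  # assumed schema grammar

def chunk_model(node, chunks=None):
    """Produce one JSON text chunk per UI tree element, keyed by id."""
    if chunks is None:
        chunks = {}
    if node["type"] not in SCHEMA_TYPES:
        # Reject the chunk rather than store a grammar-violating state.
        raise ValueError(f"schema violation: {node['type']}")
    shallow = {k: v for k, v in node.items() if k != "children"}
    chunks[node["id"]] = json.dumps(shallow)
    for child in node.get("children", []):
        chunk_model(child, chunks)
    return chunks

tree = {"id": "root", "type": "panel", "children": [
    {"id": "btn1", "type": "button", "label": "OK"}]}
chunks = chunk_model(tree)
```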
  • In cases where data model data 121 is divided into text chunks, a two-way data binding may be applied between a textual editor view and the text chunks of the data model data, such that any edits to the data model data cause the textual editor to update automatically. Accordingly, in instances where editor A is a textual editor, editor A may have a two-way data binding between itself and the text chunks of the data model data 121. In such cases, any edits made to data model data 121 (e.g. as a result of editing inputs 106) cause the textual editor (editor A) to update automatically.
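A two-way binding of this kind might be sketched with an observer pattern (hypothetical classes, not the disclosed implementation): the textual editor view and a model text chunk mirror each other's changes automatically, with an origin check to avoid update loops.

```python
# Hypothetical sketch of a two-way data binding between a textual editor
# view and one text chunk of the data model data.

class TextChunk:
    """One text chunk of the data model data."""
    def __init__(self, text):
        self.text = text
        self._listeners = []

    def subscribe(self, listener):
        self._listeners.append(listener)

    def set(self, text, origin=None):
        if text == self.text:
            return                      # nothing changed; avoid update loops
        self.text = text
        for listener in self._listeners:
            if listener is not origin:  # do not echo back to the originator
                listener(text)

class TextualEditorView:
    """An editor buffer bound to a model text chunk."""
    def __init__(self, chunk):
        self.chunk = chunk
        self.buffer = chunk.text
        self._handler = self._on_model_change
        chunk.subscribe(self._handler)

    def _on_model_change(self, text):
        self.buffer = text              # model edit -> editor updates itself

    def type_text(self, text):
        self.buffer = text              # editor edit -> model is updated
        self.chunk.set(text, origin=self._handler)

chunk = TextChunk('{"type": "button", "label": "OK"}')
editor = TextualEditorView(chunk)
```

Editing on either side then propagates to the other, which is the "update automatically" behavior the binding provides.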
  • Method 200 includes an act of receiving an editing input from a user indicating one or more desired edits that are to be made to the UI (act 220). For example, editor A (110A) may receive editing input 106 from user 105 indicating one or more desired edits that are to be made to the UI. These edits may comprise text edits, graphical edits or any other type of changes or modifications that can be made to UI 111. Editor A may be configured to output determined changes 113 indicating the editor's interpretation of the user's editing inputs. For instance, if user 105 indicates in editing input 106 that a diagram element is to be changed, the editor may identify in changes 113 which element is to be changed. Thus, if the diagram element is to be moved, the editor would identify the current position of the element and the new (moved) position of the element. This is one example among many, many different changes that can be made. Accordingly, the example is intended merely to illustrate how changes 113 may be generated and should not be read as limiting the type or number or style of changes that can be made.
  • Method 200 also includes an act of determining, based on the received editing inputs, which changes are to be made to the data model data to enact the desired changes in the UI (act 230). For example, update determining module 115 may determine, based on received editing inputs 106 (and/or determined changes 113), which UI model updates 123 are to be made to data model data 121 (upon which UI 111 is built) to enact the desired changes in the UI. These UI model updates may comprise, for example, an indication of which elements are to be changed and by how much.
  • The UI model updates 123 may, at least in some cases, be transformed into UI changes using purely functional transformations. Because the UI is data-driven, data model data 121 may be changed in a purely functional transformation, without recompiling the UI or even stopping it to make the changes. In some embodiments, the UI model updates 123 indicating how the data model data 121 is to be transformed into UI changes 131 are data and can be changed, manipulated or transferred in the same way data is normally handled. Moreover, the data indicating how the data model data 121 is to be transformed into UI changes 131 may be mapped to a structured UI representation including text, a UI tree, a graphical diagram or any other type of structured UI representation.
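A purely functional transformation of this sort might look like the following sketch (hypothetical structure; for illustration only): applying an update returns a new UI description rather than mutating the old one, which is why no recompilation or restart is required.

```python
# Hypothetical sketch: the model-to-UI mapping as a pure function.
# Applying an update produces a new description; the original is untouched.

def apply_update(model, update):
    """Return a new model tree with `update` merged into the matching element."""
    if model.get("id") == update["id"]:
        return {**model, **update["changes"]}
    children = [apply_update(c, update) for c in model.get("children", [])]
    out = dict(model)
    if children:
        out["children"] = children
    return out

model = {"id": "root", "type": "panel",
         "children": [{"id": "btn1", "type": "button", "label": "OK"}]}
updated = apply_update(model, {"id": "btn1", "changes": {"label": "Cancel"}})
```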
  • In some embodiments, update determining module 115 may comprise a data flow engine. A data flow engine, as used herein, may refer to a software engine that supports functional composition and the use of queries for property-binding, repeated object construction, and conditional object construction. The data flow engine may be configured to receive and process functional application descriptions based on models. For instance, the data flow engine may be configured to process extensible markup language (XML) application foundation (XAF) applications or other functional application descriptions. Additionally or alternatively, the data flow engine may be configured to process queries constructed by the editors.
  • XAF, as used herein, is a software application framework for generating a runtime structure for a given software application and for managing execution of the software application. Applications may be generated by an application framework that includes a collection or graph of connected application components. Functionalities of an application configured according to the application framework are enabled by dynamically configuring groups of application components into component domains where each domain is configured to enable a given functionality of the application, for example, displaying a picture in a word processing document.
  • In order to generate a runtime structure for the application, an application may pass an application description for each piece of application functionality to an application description engine. The application description provides declarative rules for structuring and composing the component domains, and the application description engine is operative to interpret the declarative rules for creating and reconfiguring the component domains as required based on data events received by the application. Data events, as used herein, may be any type of software message, user input, or other indication that something has occurred that may affect the software application. Data events may occur, for example, when a user clicks a mouse to interact with the application, or when an error message is presented during processing of the application.
  • In some cases, XAF may be configured to track dependencies such that data evaluations and object graph constructions may be incrementally updated when data changes. For example, XAF may track dependencies in data such that a software application may be incrementally updated when the data is updated. In some embodiments, this incremental updating may be carried out automatically by a XAF engine (e.g. the data flow engine or update determining module 115).
  • Thus, in some embodiments, data model data 121 may comprise a schematized XAF application description. This XAF application description may, in turn, be interpreted by the XAF engine (or update determining module 115). Moreover, in some cases, at least one of the real-time incremental editors 110A/110B may be instantiated based on a XAF application description. It should be noted that while XAF engines and XAF application descriptions are provided as examples, any type of functional application description language or functional application description language engine may be used.
  • Method 200 includes an act of updating the UI in real-time based on the determined edits to the data model data (act 240). For example, runtime model interpreter 130 may receive UI model updates, interpret those model updates in real-time (or substantially real-time) and send the UI updates 131 back to the editor(s). These changes may be applied to the UI(s) dynamically in real-time, without stopping or recompiling the running UI. In some cases, only the minimal set of controls and properties (or other elements) is updated based on determined data dependencies. Accordingly, if update determining module 115 (or a data flow engine) determines that the data changes to element A, for example, affect elements C, D and F based on determined data dependencies, only elements A, C, D and F will be updated while the other elements are left unchanged.
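The dependency tracking used to find that minimal set might be sketched as follows (the dependency table is assumed for illustration; the disclosed data flow engine is not limited to this form): a transitive walk from the changed element over recorded dependencies yields exactly the elements to refresh.

```python
# Hypothetical sketch of dependency tracking: given which element's data
# changed, only that element and its transitive dependents are refreshed.

deps = {            # element -> elements whose values depend on it (assumed)
    "A": ["C", "D"],
    "C": ["F"],
}

def affected(changed, deps):
    """Return the minimal set of elements to refresh for one change."""
    seen = set()
    stack = [changed]
    while stack:
        element = stack.pop()
        if element in seen:
            continue
        seen.add(element)
        stack.extend(deps.get(element, []))  # follow transitive dependents
    return seen

minimal = affected("A", deps)  # elements B, E, etc. are left alone
```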
  • In instances where update determining module 115 comprises a data flow engine, the data flow engine may be configured to support transformations of collections, records and atoms (and any other forms of data model data) as well as track dependencies across data model data items. By tracking such data dependencies, the minimal set of elements to be changed may be discovered and other elements may be left unchanged. In another embodiment, a UI model interpreter may be instantiated to perform each of the following steps in real-time: receive a declaratively defined UI model, interpret the UI model via functional transformation using the data flow engine, and present the resulting UI to the user.
  • Returning to FIG. 2, method 200 includes an act of presenting the updated UI in essentially real-time without losing state information in the resultant UI (act 250). For example, running application 135 may present the visualized results of the updated UI 111 in essentially real-time without losing state information in the UI. Thus, if a user had previously selected a given element, that element would still be selected after the updates were dynamically applied. Moreover, it should be noted that wherever the terms real-time, essentially real-time or substantially real-time appear, these terms are referring to a short time frame and not to an instantaneous event. Accordingly, even in real-time, a delay may occur between the time an input is received and when that input is applied.
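The state-preservation behavior described above (a previously selected element remaining selected after an update) might be sketched as follows, with hypothetical names and runtime state deliberately kept separate from the model-driven properties:

```python
# Hypothetical sketch: applying a model update in place while keeping
# runtime state (here, which element is selected) intact.

class RunningUI:
    def __init__(self, elements):
        self.elements = dict(elements)  # id -> properties driven by the model
        self.selected = None            # runtime state, not part of the model

    def select(self, element_id):
        self.selected = element_id

    def apply_update(self, element_id, changes):
        # Only the model-driven properties change; selection state survives
        # because the running UI is never stopped or rebuilt.
        self.elements[element_id] = {**self.elements[element_id], **changes}

ui = RunningUI({"btn1": {"label": "OK"}})
ui.select("btn1")
ui.apply_update("btn1", {"label": "Save"})
```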
  • One example of displaying the visualized results of the updated UI may be shown in environment 400 of FIG. 4. Environment 400 includes an editor and a corresponding UI shown at two different times (i.e. editor at time 1 (T1) 410 with corresponding UI at T1 411 and editor at time 2 (T2) 420 with corresponding UI at T2 421). Editor at T1 includes a UI 411 that has three items: input box 412, button 413 and selectable/draggable object 414. A user may indicate that selectable/draggable object 414 is to be moved to a position above input box 412, that button 413 is to be deleted and that a menu box is to be added. These changes may be applied incrementally or may be applied as a group. Accordingly, although not shown in FIG. 4, in the above example a user may see each edit incrementally applied to the UI essentially as soon as the edit has been input by the user. At time T2, UI 421 may appear with each of the above changes applied: selectable/draggable object 424 has been moved to its new location, input box 422 has been moved to a lower position, button 413 has been deleted and menu 425 has been added to the UI. This example is merely intended to show one example of how UI elements may be updated in real-time. In other embodiments, different elements, different applications and different numbers of elements may be used.
  • In this manner, a user's edits may be analyzed to determine which changes are to be made to the user interface (or other software application) and applied to the user interface in real-time, while the UI is running, without stopping the application and without losing state information. In a similar vein, the potential results of a user's edits may be previewed within an editor, as explained below with regard to method 300 of FIG. 3.
  • FIG. 3 illustrates a flowchart of a method 300 for presenting a preview of the potential results of a user edit to data model data. The method 300 will now be described with frequent reference to the components and data of environments 100 of FIGS. 1 and 500 of FIG. 5.
  • Method 300 includes an act of instantiating a real-time incremental editor configured to receive editing inputs that are to be applied to data of a data model, wherein the data model data declaratively defines a UI model that drives the behavior of a corresponding UI based on the UI model (act 310). For example, a computer system may instantiate incremental real-time editor A (110A) which may be configured to receive editing inputs 106 that are to be applied to data 121 of data model 120. As described above, data model data 121 declaratively defines a UI model that drives the behavior of UI 111 which is based on the UI model. Because the UI model is declaratively defined, simple changes to the model data can be applied in real-time without having to recompile or even stop a running software application that is based on the data model.
  • Method 300 further includes an act of receiving an input from a user at the instantiated editor, the input indicating one or more edits that are to be applied to a running software application (act 320). For example, editor B (110B) may receive editing input 106 from user 105 indicating one or more desired edits that are to be made to the UI. These edits may comprise text edits, graphical edits or any other type of changes or modifications that can be made to UI 111.
  • Method 300 also includes an act of determining, based on the received user input, which changes are to be made to the data model data to enact the desired changes in the running software application (act 330). For example, update determining module 115 may determine, based on received editing inputs 106 (and/or determined changes 113), which UI model updates 123 are to be made to data model data 121 (upon which UI 111 is built) to enact the desired changes in the UI. These UI model updates may comprise, for example, an indication of which elements are to be changed and by how much.
  • Method 300 includes an act of previewing the determined changes in real-time on the running application (act 340). For example, running application 135 may be configured to preview the determined UI model updates 123 which are based on determined changes 113 which are in turn based on editing input 106. These UI model updates may be applied to a running UI by runtime model interpreter 130 which interprets the UI model (and changes thereto) at runtime and provides the actual UI updates which are displayed in the UI (111) of the running application (135). Thus, as a user applies inputs, the results of those inputs may be displayed in the running application.
  • In some embodiments, user 105 may select an element of the data model (i.e. a UI element) in the editor and the editor may display, in real-time, a highlight around each corresponding instance of the selected element on the running application. Accordingly, as shown in FIG. 5, editor 510 may present UI 515 to the user with various elements including a menu, input box and other items A, B, C and D. User 105 may select item B (i.e. selected item 516) and, upon selection, any corresponding items in the running application that are implemented based on item B may be highlighted. Thus, in running application 520, items B1, B2 and B3, each of which implement item B of UI 515, are highlighted. In this manner, the user can easily look at the running application and identify each of the items or areas where the selected item is being used. This can be helpful in debugging the running application. The other UI items of the running application (i.e. menu 521, item A 522A, item C 522C, item D 522D and input box 523) are not highlighted unless or until the item is selected in the editor.
  • The selected item may be identified within the running application using a query. Editor 510 may be configured to construct a query that is used to query for each instance of the selected element. The editor can then identify the items based on the response to the query. In some embodiments, a host environment may be configured to display the highlighting around each corresponding instance of the selected element on the running application. Additionally or alternatively, user inputs may be received at the editor that result in a new feature being added to the running software application. In such cases, the new feature may be dynamically added to the running software application in real-time.
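The select-then-highlight flow of FIG. 5 can be sketched as a query over the running application's items. The representation below (a list of dicts with a `based_on` field linking each running item back to its defining model element) and the function names are illustrative assumptions, not the patent's data structures; the sketch only shows constructing a query for a selected element and collecting every matching instance for highlighting.

```python
# Hypothetical sketch of query-based instance highlighting: selecting
# element "B" in the editor yields a query that matches every running
# item implemented from "B". Item ids mirror FIG. 5 for readability.

running_app = [
    {"id": "menu",     "based_on": "menu"},
    {"id": "B1",       "based_on": "B"},
    {"id": "B2",       "based_on": "B"},
    {"id": "B3",       "based_on": "B"},
    {"id": "itemA",    "based_on": "A"},
    {"id": "inputBox", "based_on": "inputBox"},
]

def build_query(selected_element):
    """Construct a predicate matching each instance that implements
    the element selected in the editor."""
    return lambda item: item["based_on"] == selected_element

def find_instances(app_items, query):
    """Return the ids of all matching items; the host environment would
    then draw a highlight around each of these in the running UI."""
    return [item["id"] for item in app_items if query(item)]

highlighted = find_instances(running_app, build_query("B"))
# highlighted == ["B1", "B2", "B3"]; menu, itemA and inputBox stay unhighlighted
```

As in the description, the items not derived from the selected element fall outside the query result and remain unhighlighted until their own element is selected.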
  • Accordingly, an environment may be provided in which a user may select a UI (or other) item and each corresponding instance of that selected item will be highlighted in the running application, without stopping and/or restarting the application. Moreover, a development environment may be provided where a user's editing inputs are analyzed to determine which changes are to be made to the user interface (or other software application) and are applied to the user interface in real-time, while the UI is running, without stopping the application and without losing state information.
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

We claim:
1. At a computer system that includes one or more processors and system memory, a method for providing a real-time incremental editor for enacting changes to user interface (UI) elements of an active UI, the method comprising:
presenting a first UI of an executing application program based on a UI model that includes declarative model data that drives the behavior of the first UI, the UI model including particular declarative model data that declaratively defines a visual appearance of one or more UI elements of the first UI, each of the one or more UI elements being an instance of the particular declarative model data; and
concurrent to presenting the first UI of the application program:
presenting a second UI of a real-time incremental editor that is separate from the application program and that is configured to receive one or more editing inputs that are to be applied to the particular declarative model data of the UI model, to edit the first UI of the application program while the application program is executing;
receiving a first editing user input at the second UI of the real-time incremental editor, the first editing user input selecting the particular declarative model data that declaratively defines the one or more UI elements;
based on selection of the particular declarative model data within the real-time incremental editor, visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI of the real-time incremental editor;
receiving a second editing user input at the second UI of the real-time incremental editor, the second editing user input providing one or more changes to the particular declarative model data corresponding to the one or more UI elements, in order to enact one or more desired edits to each of the one or more UI elements; and
updating the first UI to reflect the one or more desired edits to the one or more UI elements, including altering the visual appearance of each of the one or more UI elements based on the one or more changes to the particular declarative model data corresponding to the one or more UI elements.
2. The method as recited in claim 1, wherein visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI comprises drawing a highlight around each of the one or more UI elements at the first UI.
3. The method as recited in claim 1, wherein the declarative model data comprises a schematized extensible markup language application foundation (XAF) application description.
4. The method as recited in claim 1, wherein the declarative model data is divided into chunks based on corresponding portions of a UI tree.
5. The method as recited in claim 1, further comprising, based on the first editing user input, constructing a query that is used to query for each instance of a UI element corresponding to the particular declarative model data.
6. The method as recited in claim 1, further comprising:
receiving a third editing user input at the second UI of the real-time incremental editor that results in a new feature being added to the application program; and
dynamically adding the new feature to the application program in real-time.
7. The method as recited in claim 1, wherein updating the first UI to reflect the one or more desired edits to the one or more UI elements comprises previewing one or more changes to the one or more UI elements.
8. One or more hardware storage devices having stored thereon computer-executable instructions that, when executed by one or more hardware processors of a computer system, cause the computer system to provide a real-time incremental editor for enacting changes to user interface (UI) elements of an active UI, including the following:
presenting a first UI of an executing application program based on a UI model that includes declarative model data that drives the behavior of the first UI, the UI model including particular declarative model data that declaratively defines a visual appearance of one or more UI elements of the first UI, each of the one or more UI elements being an instance of the particular declarative model data; and
concurrent to presenting the first UI of the application program:
presenting a second UI of a real-time incremental editor that is separate from the application program and that is configured to receive one or more editing inputs that are to be applied to the particular declarative model data of the UI model, to edit the first UI of the application program while the application program is executing;
receiving a first editing user input at the second UI of the real-time incremental editor, the first editing user input selecting the particular declarative model data that declaratively defines the one or more UI elements;
based on selection of the particular declarative model data within the real-time incremental editor, visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI of the real-time incremental editor;
receiving a second editing user input at the second UI of the real-time incremental editor, the second editing user input providing one or more changes to the particular declarative model data corresponding to the one or more UI elements, in order to enact one or more desired edits to each of the one or more UI elements; and
updating the first UI to reflect the one or more desired edits to the one or more UI elements, including altering the visual appearance of each of the one or more UI elements based on the one or more changes to the particular declarative model data corresponding to the one or more UI elements.
9. The one or more hardware storage devices as recited in claim 8, wherein visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI comprises drawing a highlight around each of the one or more UI elements at the first UI.
10. The one or more hardware storage devices as recited in claim 8, wherein the declarative model data comprises a schematized extensible markup language application foundation (XAF) application description.
11. The one or more hardware storage devices as recited in claim 8, wherein the declarative model data is divided into chunks based on corresponding portions of a UI tree.
12. The one or more hardware storage devices as recited in claim 8, further comprising, based on the first editing user input, constructing a query that is used to query for each instance of a UI element corresponding to the particular declarative model data.
13. The one or more hardware storage devices as recited in claim 8, further comprising:
receiving a third editing user input at the second UI of the real-time incremental editor that results in a new feature being added to the application program; and
dynamically adding the new feature to the application program in real-time.
14. The one or more hardware storage devices as recited in claim 8, wherein updating the first UI to reflect the one or more desired edits to the one or more UI elements comprises previewing one or more changes to the one or more UI elements.
15. A computer system, comprising:
one or more hardware processors; and
one or more storage devices having stored thereon computer-executable instructions that, when executed by the one or more hardware processors, cause the computer system to provide a real-time incremental editor for enacting changes to user interface (UI) elements of an active UI, including the following:
presenting a first UI of an executing application program based on a UI model that includes declarative model data that drives the behavior of the first UI, the UI model including particular declarative model data that declaratively defines a visual appearance of one or more UI elements of the first UI, each of the one or more UI elements being an instance of the particular declarative model data; and
concurrent to presenting the first UI of the application program:
presenting a second UI of a real-time incremental editor that is separate from the application program and that is configured to receive one or more editing inputs that are to be applied to the particular declarative model data of the UI model, to edit the first UI of the application program while the application program is executing;
receiving a first editing user input at the second UI of the real-time incremental editor, the first editing user input selecting the particular declarative model data that declaratively defines the one or more UI elements;
based on selection of the particular declarative model data within the real-time incremental editor, visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI of the real-time incremental editor;
receiving a second editing user input at the second UI of the real-time incremental editor, the second editing user input providing one or more changes to the particular declarative model data corresponding to the one or more UI elements, in order to enact one or more desired edits to each of the one or more UI elements; and
updating the first UI to reflect the one or more desired edits to the one or more UI elements, including altering the visual appearance of each of the one or more UI elements based on the one or more changes to the particular declarative model data corresponding to the one or more UI elements.
16. The computer system as recited in claim 15, wherein visually distinguishing each of the one or more UI elements within the first UI as being selected at the second UI comprises drawing a highlight around each of the one or more UI elements at the first UI.
17. The computer system as recited in claim 15, wherein the declarative model data comprises a schematized extensible markup language application foundation (XAF) application description.
18. The computer system as recited in claim 15, wherein the declarative model data is divided into chunks based on corresponding portions of a UI tree.
19. The computer system as recited in claim 15, further comprising, based on the first editing user input, constructing a query that is used to query for each instance of a UI element corresponding to the particular declarative model data.
20. The computer system as recited in claim 15, further comprising:
receiving a third editing user input at the second UI of the real-time incremental editor that results in a new feature being added to the application program; and
dynamically adding the new feature to the application program in real-time.
US14/285,039 2009-11-16 2014-05-22 Dynamic editors for functionally composed ui Abandoned US20140258971A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US26170109P 2009-11-16 2009-11-16
US12/637,353 US8756515B2 (en) 2009-11-16 2009-12-14 Dynamic editors for functionally composed UI
US14/285,039 US20140258971A1 (en) 2009-11-16 2014-05-22 Dynamic editors for functionally composed ui

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/637,353 Continuation US8756515B2 (en) 2009-11-16 2009-12-14 Dynamic editors for functionally composed UI

Publications (1)

Publication Number Publication Date
US20140258971A1 true US20140258971A1 (en) 2014-09-11

Family

ID=44012250

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/637,353 Active 2030-08-24 US8756515B2 (en) 2009-11-16 2009-12-14 Dynamic editors for functionally composed UI
US14/285,039 Abandoned US20140258971A1 (en) 2009-11-16 2014-05-22 Dynamic editors for functionally composed ui

Country Status (1)

Country Link
US (2) US8756515B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160283204A1 (en) * 2015-03-25 2016-09-29 Ca, Inc. Editing software products using text mapping files
US10073604B2 (en) * 2014-05-15 2018-09-11 Oracle International Corporation UI-driven model extensibility in multi-tier applications
WO2021013655A1 (en) * 2019-07-25 2021-01-28 Robert Bosch Gmbh System and method for gui development and deployment in a real time system

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120117497A1 (en) * 2010-11-08 2012-05-10 Nokia Corporation Method and apparatus for applying changes to a user interface
CN103425476B (en) 2012-05-25 2016-12-14 国际商业机器公司 For providing the method and system of data for the user interface of performance monitoring
US20140068747A1 (en) * 2012-08-31 2014-03-06 International Business Machines Corporation Automatic Completeness Checks of Network Device Infrastructure Configurations During Enterprise Information Technology Transformation
GB2523074A (en) * 2013-12-16 2015-08-19 Make Apps Better Ltd A method and system for modifying deployed applications
US10255044B2 (en) 2013-12-16 2019-04-09 Make Apps Better Ltd Method and system for modifying deployed applications
JP6287300B2 (en) * 2014-02-13 2018-03-07 コニカミノルタ株式会社 Application creation apparatus, application creation method, and application creation program
US10169313B2 (en) * 2014-12-04 2019-01-01 Sap Se In-context editing of text for elements of a graphical user interface
US9823906B2 (en) * 2016-03-31 2017-11-21 Sap Se Complementary model-driven and textual development using enforced formatting constraints
US10824800B2 (en) 2017-11-10 2020-11-03 Think Research Corporation System and method for designing and editing computerized electronic data-entry forms
JP7052620B2 (en) * 2018-07-30 2022-04-12 オムロン株式会社 Support equipment and support programs
CN110262791B (en) * 2019-06-05 2023-10-10 北京蓝亚盒子科技有限公司 Visual programming method and device, operator and readable storage medium
US11199955B2 (en) * 2019-10-02 2021-12-14 Palantir Technologies Inc. Enhanced techniques for building user interfaces

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5982370A (en) * 1997-07-18 1999-11-09 International Business Machines Corporation Highlighting tool for search specification in a user interface of a computer system
US20010000585A1 (en) * 1997-11-05 2001-05-03 Cruise Charles C. Durable, absorbent spunlaced fabric structures
US20060012999A1 (en) * 2004-07-16 2006-01-19 Coushaine Charles M Molded-in light emitting diode light source
US20080016164A1 (en) * 2006-06-23 2008-01-17 Rohit Chandra Method and Apparatus for Automatically Embedding and emailing User-Generated Highlights
US20090008975A1 (en) * 2007-02-01 2009-01-08 Meinhard Behrens Vehicle seat system and motor vehicle having a vehicle seat system
US8259124B2 (en) * 2008-11-06 2012-09-04 Microsoft Corporation Dynamic search result highlighting

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5347627A (en) * 1992-04-07 1994-09-13 International Business Machines Corporation Graphical user interface including dynamic sizing and spacing
EP0622729A3 (en) 1993-04-29 1995-02-01 Ibm A user interface builder for a user interface server.
US5652884A (en) * 1994-11-14 1997-07-29 Object Technology Licensing Corp. Method and apparatus for dynamic update of an existing object in an object editor
US7188340B2 (en) * 2002-06-28 2007-03-06 Microsoft Corporation Hybrid tree for mixed user interface elements and sequential data
US7584420B2 (en) * 2004-02-12 2009-09-01 Lockheed Martin Corporation Graphical authoring and editing of mark-up language sequences
US8677274B2 (en) * 2004-11-10 2014-03-18 Apple Inc. Highlighting items for search results
US7577907B2 (en) * 2004-11-15 2009-08-18 Sap, Aktiengesellschaft System and method for dynamically constructing synchronized business information user interfaces
IES20060273A2 (en) * 2005-04-08 2006-11-29 Jacques Marie Yann Et Lefebvre A dynamic user interface and a method for generating a dynamic user interface for interfacing with an electronic data repository storing a collection of data elements
US8046737B2 (en) * 2005-04-29 2011-10-25 Microsoft Corporation XML application framework
US8418132B2 (en) * 2005-04-29 2013-04-09 Microsoft Corporation Application description language
US7818736B2 (en) * 2005-09-14 2010-10-19 International Business Machines Corporation Dynamic update mechanisms in operating systems
US7904809B2 (en) * 2007-06-29 2011-03-08 Microsoft Corporation Model-based editors for dynamic validation
US7979847B2 (en) * 2007-09-28 2011-07-12 Microsoft Corporation Visual debugger for declarative/data-flow applications
US20090112570A1 (en) * 2007-10-26 2009-04-30 Microsoft Corporation Declarative model interpretation
US20090288069A1 (en) * 2008-05-15 2009-11-19 One Microsoft Way Dynamic Declarative Application Description
WO2011014772A1 (en) * 2009-07-31 2011-02-03 Citizenglobal Inc. Systems and methods for content aggregation, editing and delivery

Also Published As

Publication number Publication date
US8756515B2 (en) 2014-06-17
US20110119605A1 (en) 2011-05-19

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAYADEVAN, SIDDHARTH;HACKMANN, CHRISTOPHER DONALD;DANTON, STEPHEN M.;AND OTHERS;SIGNING DATES FROM 20091208 TO 20091211;REEL/FRAME:032951/0089

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417

Effective date: 20141014

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454

Effective date: 20141014

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION