WO2015063675A1 - Data processing - Google Patents

Data processing

Info

Publication number
WO2015063675A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
model
meta
entity
node
Prior art date
Application number
PCT/IB2014/065652
Other languages
French (fr)
Inventor
Jacques CASSAGNABERE
Wajira WEERASINGHE
Original Assignee
WARNAKULASURIYA, Dishan, Amrit, Jitendrakumar
Priority date
Filing date
Publication date
Application filed by WARNAKULASURIYA, Dishan, Amrit, Jitendrakumar filed Critical WARNAKULASURIYA, Dishan, Amrit, Jitendrakumar
Publication of WO2015063675A1 publication Critical patent/WO2015063675A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00 Arrangements for software engineering
    • G06F8/20 Software design
    • G06F8/24 Object-oriented

Definitions

  • the present invention relates to methods and apparatus for data processing.
  • data processing systems may require updates to maintain or improve the processing of data. Such updates may for example add additional functionality where desired, may improve existing functionality, or may provide for integration of separate systems into the data processing system.
  • Data processing systems may for example comprise functionality that is provided and managed by a single entity or may include functionality provided by a third party service provider, which may for example be located at a separate physical location. Integration of different systems presents various challenges, for example relating to consistency of how data is communicated between different systems.
  • ESB Enterprise Service Bus
  • a common messaging model is provided to allow systems to be integrated.
  • system integration is provided by way of a bus that receives data from systems in the form of messages, processes the messages and outputs messages to other systems.
  • although the ESB allows systems to communicate with one another, the ESB is complex to maintain and configure and does not provide sufficient flexibility for data processing systems to be easily updated. There therefore remains a need for improvements in data processing.
  • a method of processing data comprising: receiving data to be processed; receiving a model representing the data and relationships between the data, the model defining at least one rule associated with data represented by the model; and processing the received data based upon the at least one rule defined by the model.
  • the processing can be controlled by the model.
  • Controlling processing of data based upon a model in this way allows aspects of data processing to be designed in advance and controlled by the model.
  • the model allows for example multiple systems to process the data in a consistent way. Additionally the model can allow the data processing to be modified in a straightforward way.
  • the validity data may comprise a time, wherein determining an instance of the data to be processed based upon the validity data associated with each instance of the data comprises determining a relationship between the time associated with instances of the data and a predetermined time.
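  • As an illustrative sketch only (not the claimed implementation), the following Python fragment selects, from several stored instances of the same data item, the instance whose assumed "valid from"/"valid to" window contains a given reference time; the class and field names are assumptions introduced for the example.
```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class DataInstance:
    value: str
    valid_from: datetime                 # time from which this instance applies
    valid_to: Optional[datetime] = None  # None means "still valid"

def select_valid_instance(instances: List[DataInstance], at_time: datetime):
    """Return the instance whose validity window contains at_time, if any."""
    for inst in instances:
        if inst.valid_from <= at_time and (inst.valid_to is None or at_time < inst.valid_to):
            return inst
    return None

# Example: two versions of the same data item, only one valid at the reference time.
history = [
    DataInstance("old address", datetime(2010, 1, 1), datetime(2011, 1, 20)),
    DataInstance("new address", datetime(2011, 1, 20)),
]
print(select_valid_instance(history, datetime(2014, 6, 1)).value)  # -> "new address"
```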
  • the model may model a plurality of hierarchical data structures and relationships between the hierarchical data structures. The model therefore allows objects and relationships of a system to be pre-defined and processing to be controlled based upon the objects and relationships.
  • the at least one rule may define properties of the data.
  • the at least one rule may define the form of the data and required fields in the data.
  • the model may for example allow a data structure to be instantiated based upon the model.
  • the at least one rule may define properties associated with processing of the data.
  • the at least one rule may define access permissions for the data and processing that is permitted to be carried out on the data.
  • the model may define a plurality of groups, wherein each group is associated with one or more data items represented by the model. Each group may define a logical isolation from other ones of the plurality of groups.
  • the groups may for example define collections of data items and/or objects associated with the data processing system that have a functional or logical relationship. Grouping data items in this way allows modifications to be made to a group in a straightforward way.
  • the model may comprise a plurality of entities, the plurality of entities together representing the data and relationships between the data.
  • the model may comprise a plurality of instances of an entity, each instance of the entity comprising data defining a version associated with the entity.
  • a first plurality of entities may represent data and relationships between the data and a second plurality of entities may define rules associated with the data and processing of the data.
  • Each of the second plurality of entities may have a relationship with at least one entity of the first plurality of entities, each relationship providing an association between a first entity and a second entity, wherein data represented by a first entity is processed based upon rules defined by second entities having a relationship with the first entity.
  • the first plurality of entities may for example correspond to objects of a system and the second plurality of entities may provide meta data associated with the objects of the system.
  • the meta data may provide rules associated with the data for processing.
  • Each entity may comprise data associated with creation and/or modification of the entity.
  • the data associated with creation and/or modification of the entity may comprise data associated with a property selected from the group consisting of: a date the entity is created, a user that created the entity, a date the entity was last updated and a date that the entity was last used.
  • the method may further comprise receiving a further model, the further model representing data processing functionality and rules associated with the data processing functionality, wherein processing the received data comprises: receiving data indicating data processing functionality to be performed on the received data; and processing the received data based upon the received data indicating data processing functionality and based upon rules associated with the data processing functionality.
  • the further model may therefore provide further rules associated with the system and provide further control and integration of data processing systems.
  • the method may further comprise receiving updated model data, and modifying the model representing the data and relationships between the data based upon the updated model data, wherein modifying the model comprises: storing an instance of the model corresponding to the received model; and generating an instance of the model based upon the received model and the updated model data. That is, instances of the model may be stored whenever the model is modified. In this way the model instances allow data auditing of the model and the facility to undo any changes in a straightforward way.
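  • A minimal sketch of this versioning behaviour is given below, assuming a simple in-memory list of model instances; the class and method names are illustrative and are not taken from the specification.
```python
from copy import deepcopy
from datetime import datetime

class VersionedModel:
    """Keeps every instance of a model so that changes can be audited or undone."""

    def __init__(self, initial: dict):
        self.versions = [{"model": deepcopy(initial), "stored_at": datetime.now()}]

    @property
    def current(self) -> dict:
        return self.versions[-1]["model"]

    def apply_update(self, updated_fields: dict) -> None:
        # The existing instance stays stored; a new instance is generated from the
        # current model plus the updated model data and becomes the latest version.
        new_model = deepcopy(self.current)
        new_model.update(updated_fields)
        self.versions.append({"model": new_model, "stored_at": datetime.now()})

    def undo(self) -> None:
        # Reverting a change simply discards the latest stored instance.
        if len(self.versions) > 1:
            self.versions.pop()

model = VersionedModel({"Employee": ["EmployeeID", "Name"]})
model.apply_update({"Employee": ["EmployeeID", "Name", "ContactDetails"]})
print(len(model.versions))  # 2 stored instances: the original and the updated model
```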
  • the model may model a plurality of qualifiers associated with data represented by the model.
  • the qualifiers may for example further specify properties of the data.
  • a method of processing data at a plurality of data processing systems comprising: processing first data at a first data processing system according to any preceding claim to generate first processed data; and processing the generated first processed data at a second data processing system according to any preceding claim.
  • the second aspect of the invention therefore uses the model of the first aspect of the invention to provide consistency between different systems and allows multiple systems to be integrated and to process the same data. Improved integration of the systems is therefore provided.
  • Aspects of the invention can be implemented in any convenient form. For example computer programs may be provided to carry out the methods described herein. Such computer programs may be carried on appropriate computer readable media, which term includes appropriate non-transient tangible storage devices (e.g. discs). Aspects of the invention can also be implemented by way of appropriately programmed computers and other apparatus. The invention may for example be carried out using either a single computer system/server or multiple computer systems/servers, and other tools may be used in the form of Database Servers, ESBs, Application Servers and Web Servers etc., with the support of programs written based on the described model/methods.
  • Figure 1 is a schematic illustration of an example data processing system
  • Figure 2 is a schematic illustration of a computer of the plurality of systems of Figure 1;
  • Figure 3 is a schematic illustration of a meta-model showing entities of the model as well as relationships between the entities
  • Figure 4 is a schematic illustration of references defined between entities within the data model
  • Figure 5 is a schematic illustration of use of references between entities of the meta-model at run-time
  • Figure 6 is a schematic illustration of a structure for display of components with multiple levels
  • Figure 7 is a schematic illustration of a hierarchical structure of qualifiers in the meta-model of Figure 3;
  • Figure 8 is a schematic illustration of the process model in further detail;
  • Figures 9a and 9b schematically illustrate addition of a meta-data element to a meta-node of the model of Figure 3;
  • Figure 10 is a schematic illustration of deletion of meta-data from a meta-node of the model of Figure 3;
  • Figure 11 is a schematic illustration of translation of data using the models described herein;
  • Figure 12 is a schematic illustration of an accessibility model;
  • Figure 13 is a schematic illustration of a data processing system using the meta-model
  • Figure 14 is a schematic illustration of Isolated instances.
  • the Business Process Meta-Model has a top level isolation making it possible to create multiple, independent, self-contained, self-functioning instances;
  • Figure 15 specifies a meta-structure arrangement to represent data structures in an instance
  • Figure 16 is a schematic illustration of Hierarchical Data Structures using Meta Nodes
  • Figure 17 is a schematic illustration of Meta-data representation of a sample meta-node
  • Figure 18 shows Meta-node to Node and Meta-Data to Data relationship
  • Figure 19 shows meta-qualifier and qualifiers connected to a specific meta-node by providing specialization to the data nodes that meta-node represents;
  • Figure 20 shows the usage of qualifier to qualify a node by providing specialization
  • Figure 21 shows the recursively repeating meta-node arrangement and notation for such
  • Figure 23 shows part of the Process Model of Figure 8.
  • an internal data processing system 1 comprises one or more systems 2, 3, 4, each system of the one or more systems 2, 3, 4 providing computing infrastructure comprising hardware and/or software.
  • Each of the plurality of systems may for example be associated with a respective business function and may provide computing functionality to the respective business function including data storage, processing and management.
  • the one or more systems 2, 3, 4 may be arranged to communicate with other ones of the one or more systems such that data may be processed using the plurality of systems. It will be appreciated that additional systems and communication between systems may be provided.
  • Additional computing infrastructure comprising hardware and/or software may additionally be provided by third party systems 5, 6.
  • the third party systems may for example comprise computing services that are outsourced to third party providers such as data storage, management and processing.
  • Figure 2 shows a computer 7 on which systems 2, 3, 4, 5, 6 of Figure 1 may operate in further detail.
  • the computer comprises a CPU 7a which is configured to read and execute instructions stored in a volatile memory 7b which takes the form of a random access memory.
  • the volatile memory 7b stores instructions for execution by the CPU 7a and data used by those instructions. For example, in use, data that is communicated between the systems of Figure 1 may be stored in the volatile memory.
  • the computer 7 further comprises non-volatile storage in the form of a hard disc drive 7c. Data such as data communicated between the systems 2, 3, 4, 5, 6 may be stored on the hard disc drive 7c.
  • the computer 7 further comprises an I/O interface 7d to which are connected peripheral devices used in connection with the computer 7. More particularly, a display 7e is configured so as to display output from the computer 7. Input devices are also connected to the I/O interface 7d. Such input devices may include a keyboard 7f and a mouse 7g which allow user interaction with the computer 7. It will be appreciated that the computer may have other input interfaces.
  • a network interface 7h allows the computer 7 to be connected to an appropriate communications network so as to receive and transmit data from and to other computers of the systems 2, 3, 4, 5, 6.
  • the CPU 7a, volatile memory 7b, hard disc drive 7c, I/O interface 7d, and network interface 7h are connected together by a bus 7i.
  • Figure 3 shows a data model of entities associated with data of a data processing system such as the data processing system of Figure 1.
  • the model shown in Figure 3 should be understood as illustrative only, and it will be appreciated that entities may be modelled in any convenient way.
  • the model comprises an instance component 301 that provides a top level isolation for instances of the model and that allows multiple instances of the model to be created.
  • the model of Figure 3 further comprises tables defining one or more hierarchical data structures.
  • the model includes node and data entities that correspond to run-time data related to objects (such as Employee record, Skill record, Incident Record, Insurance Claim Request Record etc.), and meta entities that provide a definition of associated objects of the data processing system.
  • each hierarchical data structure comprises a meta-structure 302 and one or more nodes 303.
  • Each of the one or more nodes 303 corresponds to an entity that is desirable to be represented in the model of the data processing system.
  • nodes 303 may be associated with employees, skills and hobbies in a data processing system that includes data associated with a business and that models employees of that business. It will of course be appreciated that nodes 303 may represent any data that is desirable to be processed.
  • Nodes 303 may be defined so as to define a multi-level hierarchical data structure. For example, as noted above, nodes may be associated with employees, skills and hobbies, with an employee node having skill and hobby child nodes.
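  • The parent-child arrangement of nodes described above might be pictured with the short Python sketch below; the class and field names are assumptions made for illustration.
```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """A run-time node such as an employee, skill or hobby record."""
    meta_node: str                        # name of the meta-node this node instantiates
    data: dict                            # field values held by related data entities
    children: List["Node"] = field(default_factory=list)

    def add_child(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

# An employee node with skill and hobby child nodes, forming a two-level hierarchy.
employee = Node("Employee", {"Name": "John"})
employee.add_child(Node("Skill", {"Name": "Welding"}))
employee.add_child(Node("Hobby", {"Name": "Chess"}))
print(len(employee.children))  # 2
```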
  • the meta structure 302 has a one to many relationship with nodes 303 and provides a grouping of nodes 303 in the data processing system to provide logical separation of entities represented by the nodes 303.
  • each meta-structure may be associated with a plurality of associated entities of the data processing system.
  • the logical separation of entities may for example be based upon behaviour and/or purpose of the entities of the data processing system.
  • Each node 303 has a one to many relationship with one or more data structures 304 that each stores data associated with the node 303.
  • Each data structure 304 has a relationship with a metadata structure 305 that provides rules associated with related data structures 304.
  • the rules provided by metadata structures 305 may for example define rules associated with the data such as data type and format rules, as well as default values.
  • the metadata structures 305 may additionally or alternatively define rules associated with representation of data associated with a data structure 304 such as an order in which data is displayed by software arranged to process data in the data processing system.
  • Each metadata structure 305 may specify permitted references between data structures 304 and nodes, as illustrated in Figures 4 and 5.
  • an insurance claim record may have a data field called customer which refers to an actual customer who is doing the claim.
  • Both the insurance claim record and customer record are business objects/entities one referring to the other.
  • the metadata structure may therefore define a permitted reference between the customer field of a node and a customer node of the meta-model.
  • Each node 303 has a relationship with a meta node 306, which stores information associated with related nodes 303 defining how associated nodes are created at run-time and how the data associated with the nodes may be processed.
  • Each meta node 306 may for example store data indicating a maximum and minimum number of instances of an entity represented by a node 303 that may be created. The meta node 306 may therefore be used during the processing of data using the data processing system to ensure that a maximum number of instances of an entity has not been exceeded before a further instance of an entity is created.
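  • A hedged sketch of how such a maximum-instance rule might be enforced at node creation time is shown below; the min_instances/max_instances attribute names are assumptions, not fields taken from the specification.
```python
from typing import List, Optional

class MetaNode:
    """Design-time definition controlling how many node instances may be created."""

    def __init__(self, name: str, min_instances: int = 0, max_instances: Optional[int] = None):
        self.name = name
        self.min_instances = min_instances
        self.max_instances = max_instances   # None means unlimited
        self.instances: List[dict] = []

    def create_node(self, data: dict) -> dict:
        """Create a node only if the meta-node's maximum has not been reached."""
        if self.max_instances is not None and len(self.instances) >= self.max_instances:
            raise ValueError(f"Maximum of {self.max_instances} {self.name} nodes reached")
        self.instances.append(data)
        return data

spouse_meta = MetaNode("Spouse", max_instances=1)
spouse_meta.create_node({"Name": "Jane"})
# A second call would raise ValueError because the maximum of 1 has been reached:
# spouse_meta.create_node({"Name": "Janet"})
```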
  • the meta node therefore provides a design for the data within the system that may be defined during an initial design phase.
  • the meta node 306 may additionally or alternatively be used in the display and processing of data associated with nodes in a similar manner to the metadata structure 305.
  • the meta node 306 may specify permissions for functions such as visibility, rendering, reading and writing of data associated with a node 303 that has a relationship with a meta node 306.
  • the meta node 306 may additionally or alternatively specify a structure for display of components, for example as illustrated in Figure 6.
  • an insurance claim form having three sections (a: Requester Information, b: Approver Information, c: Accounts Managers) may be created using respective meta-nodes, based upon information defined in each meta-node.
  • the meta-node may for example define permissions for different users of the system to control visibility, editing etc. When rendering the claim form to the end user, these sections can be enabled, disabled or hidden based on a user's role and permission for the role as described below.
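  • The following sketch illustrates one way that section-level permissions recorded against meta-nodes could drive rendering of the claim form; the permission values and role names are illustrative assumptions.
```python
# Assumed per-role permissions recorded against each section's meta-node.
SECTION_PERMISSIONS = {
    "Requester Information": {"requester": "write", "approver": "read", "accounts": "read"},
    "Approver Information":  {"requester": "hidden", "approver": "write", "accounts": "read"},
    "Accounts Managers":     {"requester": "hidden", "approver": "hidden", "accounts": "write"},
}

def render_claim_form(role):
    """Describe how each section of the claim form is rendered for the given role."""
    rendered = []
    for section, permissions in SECTION_PERMISSIONS.items():
        permission = permissions.get(role, "hidden")
        if permission == "hidden":
            continue                           # hidden sections are not shown at all
        mode = "editable" if permission == "write" else "read-only"
        rendered.append(f"{section} ({mode})")
    return rendered

print(render_claim_form("approver"))
# ['Requester Information (read-only)', 'Approver Information (editable)']
```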
  • Each meta node 306 has a relationship with one or more metadata structures 305 such that a meta node 306 may specify rules associated with metadata structures.
  • the meta node defines characteristics and behaviour of a business object, whereas the metadata structure defines individual data elements or fields in the business object.
  • Each node 303 may additionally have a relationship with a qualifier 307 that associates one or more properties or types with a node 303.
  • Each qualifier 307 has a relationship with a meta qualifier 308 that provides data associated with each associated qualifier 307 in a similar manner to meta nodes 306 and metadata structures 305.
  • Qualifiers 307 can be defined in a hierarchical relationship in a similar manner to nodes, as illustrated in Figure 7.
  • Each meta qualifier 308 has a relationship with a meta node 306 that may define properties of the meta qualifier 308.
  • Each meta qualifier provides a grouping of qualifiers available for an associated meta node and defines properties and behaviour of the group of qualifiers.
  • a user can select a qualifier and associate the selected qualifier with a node to qualify the node. For example, when creating an employee John it is possible to select a qualifier for the employee, such as one of the three qualifiers "Internal", "External" or "Contracted". These qualifiers are grouped and connected to the Employee meta node using the appropriate meta qualifier.
  • Each element of the model of Figure 3 has a key gid that uniquely identifies the associated element of the model and allows the data processing system to automatically identify entities of the model that are associated with run-time data.
  • each of the meta structure, meta node, meta qualifier and qualifier elements include a FunctionalID field that identifies each of the meta elements of the model and that can be used during design and configuration of the model, for example to provide references between entities of the model.
  • a special attribute Contribute To Key may be used to indicate entities that are used in the generation of the FunctionalID field for a meta structure.
  • an Employee object may be created by including an employee meta-node in the model and creating data element definitions or fields such as PartnerID, EmployeeID, Employee Name and Age etc. using meta-data entities in the model.
  • a key field may be used to identify each instance of the employee entity based upon a composite key of PartnerID+EmployeeID data fields.
  • the PartnerID and EmployeeID fields may therefore be marked as "ContributeToKey" such that the fields are used to calculate an identifier key for the data records.
  • an employee Bob may have FunctionalID 1-1 while John may have FunctionalID 2-1, which means both of them have the same EmployeeID 1 but two different partners, such that a unique key is generated for each employee using a composition of keys.
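  • A small sketch of how a FunctionalID might be composed from the fields marked ContributeToKey is shown below; the helper name and the use of a simple "-" separator are assumptions based on the 1-1 / 2-1 example above.
```python
def functional_id(record: dict, contribute_to_key: list) -> str:
    """Compose a FunctionalID from the fields marked as contributing to the key."""
    return "-".join(str(record[field]) for field in contribute_to_key)

# Fields marked "ContributeToKey" for the assumed Employee meta-node.
key_fields = ["PartnerID", "EmployeeID"]

bob = {"PartnerID": 1, "EmployeeID": 1, "Name": "Bob"}
john = {"PartnerID": 2, "EmployeeID": 1, "Name": "John"}

print(functional_id(bob, key_fields))   # "1-1"
print(functional_id(john, key_fields))  # "2-1" (same EmployeeID, different partner)
```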
  • Data auditing values are defined by the model such that any data as well as any rules that are defined relating to the data that is created or processed based upon the model can be tracked.
  • each entity of the model specifies "date created", “created by”, “last update date” and "last used” fields that specify data associated with creation and update of entities of the model. The data auditing values can therefore be used to determine when each entity of the model was created and/or updated and also who created and/or updated the entity.
  • the model additionally includes fields indicating validity of all entities of the model.
  • the fields indicating validity of entities of the model are based upon "valid from" and "valid to" fields defined within each entity of the model.
  • the fields indicating validity of entities of the model may be processed to determine whether an entity is currently valid.
  • multiple instances of each model entity and also multiple instances of data associated with each model entity may be created and stored, and the data processing system can determine which entity and/or instance of an entity should be used as the basis for data processing using the fields.
  • instances of data and rules can be created within a data processing system and implemented automatically as desired, thereby allowing improved continuity of data processing. That is, by allowing multiple instances to be created and a run time instance to be changed for a new instance by change of a single data field, updates to the data and rules used in the processing of data can be implemented in a straightforward and real time manner.
  • an instance of the model of Figure 3 is created based upon the rules specified in the meta-elements of the model.
  • an instance of the model of Figure 3 may be instantiated in a database.
  • the node and data entities of the model store data that is processed in a data processing system and the data processing system may process the data in accordance with rules specified in the meta entities of Figure 3.
  • an entity of the model may be created, as described above.
  • the model may be used to determine which instance of an entity of the model is the currently valid instance and processing may be performed accordingly. For example, a plurality of instances of an employee node may be created that corresponds to a particular employee and data within the model allows selection of the appropriate instance of the employee node for the employee.
  • the meta structures of the model of Figure 3 may additionally be used to define how data is displayed in a data processing system, and how data that is input by a user is processed in response to the user input.
  • the meta Structure, meta node and meta data along with meta qualifiers and qualifiers can be used to render data in a data input screen.
  • this information is used to dynamically render the data entry screen using appropriate software/programming tools; users can navigate to such screens and enter data, which will be stored in the form of Node 303 and Data 304 records, possibly based upon appropriate processing of the data.
  • Figure 8 shows a process model of entities associated with a data processing system such as the data processing system of Figure 1. Whilst the model of Figure 3 generally represents data of a data processing system, the model of Figure 8 shows relationships between data processing operations using the model of Figure 3. In particular, it can be seen that node 303 and meta node 306 of Figure 3 are shown in Figure 8.
  • Each node 303 has a relationship with a node status 801 that stores a status 802 for a node.
  • the node status 801 allows statuses of nodes to be determined and also allows validity of a node to be modified based upon fields of the node status 801, for example indicating an elapsed time period since a node was set to a status, and maximum duration fields defining a maximum time that a status is valid for. Where it is determined that a status is no longer valid, an alternate path trigger may be executed to modify processing associated with a node in the data processing system.
  • Each status 802 has a relationship with meta node 306 associated with node 303 such that the meta node 306 defines possible statuses for node 303.
  • an insurance claim request record associated with a meta-node may have possible statuses associated with the meta node of Requested, Pending for Evaluation, Approved, Completed Payment etc.
  • those nodes can have an associated status selected from the possible statuses associated with the meta-node.
  • Each time a status is changed the change is stored in node status 801.
  • the record with the largest SequenceNumber is the current status for that node, in this example representing the current status of the insurance claim.
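  • A brief sketch of reading the current status from a node-status history ordered by sequence number; the table layout and column names are assumptions modelled on the description.
```python
# Node-status history as described: each status change appends a row with a SequenceNumber.
node_status_history = [
    {"node_id": 42, "status": "Requested",              "SequenceNumber": 1},
    {"node_id": 42, "status": "Pending for Evaluation", "SequenceNumber": 2},
    {"node_id": 42, "status": "Approved",               "SequenceNumber": 3},
]

def current_status(history, node_id):
    """The record with the largest SequenceNumber is the node's current status."""
    rows = [row for row in history if row["node_id"] == node_id]
    return max(rows, key=lambda row: row["SequenceNumber"])["status"] if rows else None

print(current_status(node_status_history, 42))  # "Approved"
```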
  • Each status is associated with a trigger 803.
  • Each trigger 803 may be associated with an event such as user input, or an event associated with the data processing system.
  • Each trigger 803 has a relationship with one or more business rules 804 which define rules associated with a trigger.
  • each business rule 804 associated with the trigger is processed.
  • Each trigger 803 may additionally have a relationship with one or more transactions 803a that may define security and/or permissions associated with a trigger 803.
  • Each business rule 804 may have a relationship with one or more conditions 805 that each comprise a logical condition that is required to be satisfied for a business rule 804 to be executed.
  • Conditions 805 may in turn be associated with cases 806 which provide further subdivision of conditions into logical conditions.
  • each condition 805 may be based upon a plurality of cases 806 which are the smallest logically evaluable segment.
  • a case may for example contain two operands and an operation to evaluate to contribute to the condition it belongs to.
  • Cases 806 may have a sequence number field that may be used to determine an order of evaluation of cases.
  • Each case 806 may for example include Var1_ID and Var2_ID fields which can be set as first and second input parameters, and which may for example be associated with a parameter of data 304 of the model of Figure 3 or may be fixed values.
  • Each case 806 may be associated with an operator 807 defining a logical operator associated with the case 806.
  • where a business rule 804 is determined to be satisfied, for example based upon the conditions 805, cases 806 and operators 807 associated with the business rule 804, one or more operations may be performed based upon one or more macros 808 associated with the business rule 804.
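  • A simplified sketch of this rule evaluation is given below, reduced to cases combined by logical operators and macros executed when the rule is satisfied; it assumes all conditions and cases are combined with AND (the model also allows OR combinations and sequence ordering), and the structure and names are illustrative rather than the patented schema.
```python
import operator

# Supported case operators; a full system would configure these via the Operator entity.
OPS = {"==": operator.eq, "!=": operator.ne, ">": operator.gt, "<": operator.lt}

def evaluate_case(case: dict, data: dict) -> bool:
    """A case compares two operands, each either a data reference or a fixed value."""
    left = data.get(case["var1"], case["var1"])
    right = data.get(case["var2"], case["var2"])
    return OPS[case["op"]](left, right)

def evaluate_rule(rule: dict, data: dict) -> bool:
    """Simplification: a rule holds if every case of every condition holds."""
    return all(
        all(evaluate_case(case, data) for case in condition["cases"])
        for condition in rule["conditions"]
    )

def fire_trigger(rule: dict, data: dict) -> None:
    if evaluate_rule(rule, data):
        for macro in rule["macros"]:       # macros wrap configured standard routines
            macro(data)

claim_rule = {
    "conditions": [{"cases": [{"var1": "amount", "op": ">", "var2": 1000}]}],
    "macros": [lambda d: print(f"Escalating claim of {d['amount']}")],
}
fire_trigger(claim_rule, {"amount": 2500})  # prints "Escalating claim of 2500"
```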
  • Each macro 808 has a relationship with a standard routine 809 and a relationship with one or more macro parameters 810.
  • Each standard routine 809 is associated with an operation that may be executed by the data processing system and defines rules associated with execution of the operation and each macro parameter 810 defines rules associated with values for each macro 808.
  • Each macro parameter 810 defines rules associated with values for each macro 808.
  • a variable ID field of the macro parameter 810 provides a reference to the instance of a data entity that supplies the value for the macro parameter 810.
  • a standard routine entity may be used to define a relationship between the macro parameter and an instance of a data entity 304 of Figure 3 that is used in the execution of an operation associated with the macro.
  • a variable class field allows the macro parameter 810 to specify whether a value associated with the macro parameter 810 is a reference to an instance of a data entity or a fixed literal. Where a variable class is a fixed value, the fixed value may be provided by a variable value field of the macro parameter entity 810.
  • a standard routine parameter 811 allows properties of data to be defined for standard routines that are used in each instance of a standard routine.
  • Each entity of the process model of Figure 8 may additionally include data auditing fields and fields associated with validity of entities of the process model in a corresponding manner to that described above with reference to Figure 3 to provide data auditing of the process model and to allow modifications of the model to be made and tracked as described above.
  • the process model of Figure 8 therefore allows processing performed by the data processing system to be carried out in accordance with specified rules defined within the process model.
  • the model may be used to provide sharing of best practices between data processing systems.
  • the models of Figure 3 and Figure 8 define rules relating to how data is defined and how the data should be processed.
  • the models may therefore be used to instantiate data in a different but related data processing system based upon a configured data model such as the data model of Figure 3 and processing of the data may be controlled based upon a configured process model such as the process model of Figure 8.
  • Figures 9a and 9b show addition of a field "Contact Details" to a data structure.
  • Figure 9a shows a customer meta-node 306 of Figure 3 before addition of a field
  • Figure 9b shows a corresponding customer meta-node after addition of the field.
  • a new instance of a meta data structure associated with the meta-data that has a relationship with the meta-node is created in the model. The new instance is identical to the previous instance of the meta data structure but specifies that data associated with the meta node has the additional required field, and has an effective date that specifies when the new field becomes part of the data.
  • the data processing system checks the validity of the corresponding meta-data entity in the model and determines from the model that after 20/1/2011 the data includes the new field "Contact Details". Processing is then performed based upon the rules associated with the new instance of the customer meta-node defined in the model. All the meta-data related to the previous meta-node will exist in the model and in the database as they were. Processes associated with the meta-data may additionally be updated such that the new data fields are processed, for example by modifying associated processes in the process model.
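  • The effective-date behaviour can be illustrated with the brief sketch below, which picks the meta-data definition applicable at a given date; the data structures and the earlier version's effective date are assumed for illustration only.
```python
from datetime import date

# Two instances of the customer meta-data definition; the second adds "Contact Details"
# and becomes effective on 20/1/2011 as in the example above (the earlier date is assumed).
customer_metadata_versions = [
    {"fields": ["Name", "Address"],                    "effective_from": date(2009, 1, 1)},
    {"fields": ["Name", "Address", "Contact Details"], "effective_from": date(2011, 1, 20)},
]

def metadata_for(versions, on_date: date) -> dict:
    """Return the latest meta-data instance that is effective on the given date."""
    applicable = [v for v in versions if v["effective_from"] <= on_date]
    return max(applicable, key=lambda v: v["effective_from"])

print(metadata_for(customer_metadata_versions, date(2010, 6, 1))["fields"])
# ['Name', 'Address']
print(metadata_for(customer_metadata_versions, date(2012, 6, 1))["fields"])
# ['Name', 'Address', 'Contact Details']
```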
  • Figure 10 shows deletion of an address field associated with a customer node.
  • a new instance of the meta data structure associated with the data is generated with the desired modification and with dates specifying when the modification becomes effective.
  • the previous instance of the meta data structure is also updated to indicate when the data structure ceases to be effective. After the effective date the data processing system is able to determine from the model the modified fields of the data. Editing of a meta data structure may be carried out by creating a new entity in the model and marking the previous entity as deleted.
  • the data model may be used to provide data consistency within a run-time data structure. In particular, the data model specifies properties of data that are required in the run-time data structure and data can therefore be audited based upon the rules specified in the data model.
  • each instance of a table of a database may be checked against corresponding entities of the data model and the associated rules defined in the data model. Any inconsistencies that are identified can be corrected based upon the rules specified in the data model, for example using default values that may be specified in the data model.
  • the meta entities of the model allow changes to be made to the definitions of the data and any changes may be checked against existing data.
  • the meta entities may be used to modify existing data where inconsistencies are identified to ensure that the data processing system may continue to process data in the system.
  • All changes may be made to instances of the data that are not currently being processed by the data processing system based upon the models by specifying a future validity date for the corresponding entities in the model such that modifications can be made without affecting the running of the data processing system, and the changes can be caused to be automatically implemented at the appropriate time based upon the validity criterion discussed above.
  • the language and translations entities define relationships between data and languages, and allow data to be provided to a user in a selected language automatically based upon the model.
  • the translation entity stores identifiers and types of data, together with translations for the data that may be retrieved and provided as output to a user.
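  • A minimal sketch of such a translation lookup, assuming the translation entity is keyed by data identifier, type and language; the key layout and fallback behaviour are assumptions.
```python
# Assumed translation entity keyed by (data identifier, data type, language).
TRANSLATIONS = {
    ("status-approved", "status", "fr"): "Approuvé",
    ("status-approved", "status", "en"): "Approved",
}

def translate(identifier: str, data_type: str, language: str, default: str) -> str:
    """Return the stored translation for the user's preferred language, else a default."""
    return TRANSLATIONS.get((identifier, data_type, language), default)

print(translate("status-approved", "status", "fr", "Approved"))  # "Approuvé"
```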
  • a user's preferred language may for example be stored in a user entity described below.
  • the meta data structure may store data indicating whether a particular data item may be translated, and the languages that may be used.
  • Auditing may be performed on the data by the data processing system because all instances of the model are stored and may be accessed. As all the data changes are stored in copies of nodes and data, they also will be available to track the history of data changes. A report can be produced to list all the changes performed on nodes and data together with information such as Created By, Created Date, Updated By, Last Updated Date etc. which are default fields for any entity of the model including meta elements such as meta-node, meta-data etc. and process elements such as task, business rule, macro etc.
  • a further accessibility model may be provided that provides data associated with users of the system and that defines permissions for users to access the data of the data model and processes of the process model.
  • An example accessibility model is shown in Figure 12.
  • a user entity has a relationship with one or more User Profile entities such that a single user may have more than one associated user profile.
  • Each user profile is associated with at least one role.
  • the relationships between users and profiles allow profiles to be grouped according to permissions available to the profiles based upon roles associated with the profiles. The permissions may be used to determine whether a user associated with the data processing system is permitted to carry out a particular transaction, which is associated with a role.
  • Transactions of the accessibility model correspond to transactions 803a of the process model of Figure 8 such that the accessibility model in combination with the process model determines what operations are permitted to be carried out on the data by a particular user of the data processing system.
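  • A hedged sketch of the accessibility check, namely whether a user, through one of its profiles and the roles attached to those profiles, is permitted to execute a given transaction; the entity layout and names are assumptions based on the description.
```python
# Assumed accessibility data: users own profiles, profiles carry roles,
# and each transaction lists the roles permitted to invoke it.
USER_PROFILES = {"alice": ["claims_officer", "auditor"]}
PROFILE_ROLES = {"claims_officer": ["approve_claim"], "auditor": ["view_reports"]}
TRANSACTION_ROLES = {"ApproveClaimTransaction": {"approve_claim"}}

def can_execute(user: str, transaction: str) -> bool:
    """True if any role reachable from the user's profiles is permitted for the transaction."""
    user_roles = {
        role
        for profile in USER_PROFILES.get(user, [])
        for role in PROFILE_ROLES.get(profile, [])
    }
    return bool(user_roles & TRANSACTION_ROLES.get(transaction, set()))

print(can_execute("alice", "ApproveClaimTransaction"))  # True
print(can_execute("bob", "ApproveClaimTransaction"))    # False (unknown user)
```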
  • a set of Application Programming Interfaces may be defined to provide the functionality of a data processing system based upon a run-time data structure.
  • Such an API can be provided to enable communication between a run-time data structure and processes based upon the data and process models and external services.
  • Such an API may provide functionality for obtaining data from the run-time data structure and for writing data to the run-time data structure ensuring that the rules defined by the models described above are obeyed.
  • Using an API in this way allows integration of data processing systems, including external systems and legacy systems, into a common platform that is controlled using the data and process models.
  • User interface screens may additionally be provided to allow access to the run-time data.
  • the user interfaces may for example be provided by way of forms (using data structures) such as Incident, Problem Form, Configuration, Issue Tracking, Employee Time Sheet Entry. Display of data within forms may be controlled based upon the data and process models as described above.
  • the process model may be used to control workflows within the data such as controlling forms that are displayed in response to triggers as described above, with display being controlled within the framework provided by the data and process models.
  • Figure 13 is a schematic illustration of generation and use of a data processing system using the models described above.
  • a design function 1301 provides an environment for design of the models for use in a specific data processing system.
  • the design function 1301 provides functionality allowing the design of instances of the models described above that have specific data and rules for a particular data processing system.
  • Design function 1301 may be implemented using appropriate software/hardware/tools to create and maintain the meta-model (meta level elements) to create business structures and processes as well as configuration of the other elements such as ESB and APIs etc.
  • Data store functionality 1302 stores data that is to be processed together with the models described above. As described above data stored in the database has a relationship with the entities of the models and the data is processed based upon the model and the relationship between the data and the entities of the model. For example, each instance of a node and data structure entity of the model may have a corresponding data item in the data to be processed and the relationship between the node and data structure and the data is known.
  • the data store functionality may be used to store all the information required for the data processing system to function, in the form of a relational database as specified in Figures 3, 8, 11 and 12. Any suitable database technology can be used to generate the data store using the meta-model.
  • a runtime environment 1303 provides an environment in which data of the data store functionality 1302 may be processed based upon the models and additionally provides functionality to allow external systems 1304 to participate in the data processing.
  • the runtime environment 1303 may for example incorporate an enterprise service bus to provide communication between systems that participate in the data processing.
  • the runtime environment 1303 executes processes and business rules, provides a user interface, provides connectivity to other external systems, etc.
  • This system can be created using appropriate software/hardware/tools such as programmed modules, ESBs, Web Servers, Application Servers etc.
  • the data store functionality includes a data access layer 1305 that provides an interface between the stored data and the design functionality and runtime environment.
  • the data access layer may for example provide one or more APIs for communication between the stored data and data processing systems.
  • the data to be processed and data defining the models may be stored for example in one or more databases 1306.
  • the models described above may therefore provide a consistent framework for integrating data processing across multiple systems, which systems may be purpose designed or may be existing systems that interface with the runtime environment 1303 to provide functionality to the data processing system. By basing data processing on models that define rules for how the data may be processed, the systems are able to be integrated in a straightforward way.
  • because the models allow multiple instances of data structures to be created and to exist simultaneously, updating of the data processing system may be provided in a straightforward and consistent way without the need to stop processing within the runtime environment whilst updates are programmed.
  • the data structures can be automatically integrated based upon data indicating validity of the data structures to provide runtime modification and update of the data processing system that is implemented consistently across all systems of the data processing system.
  • the present invention relates to a meta-model and a methodology to define and configure a dynamic communication platform using real-world business data structures, processes, and their interactions and integration points, with the capability to change without additional programming effort, with support for running different versions of structures and processes in parallel to make change transitions smooth, while providing its own consistency auditing, transaction logging, multilingualism and release management using a single, all-in-one meta-model.
  • This meta-model has the capability to hold business content in the form of structures, processes and configurations, allowing community members to share best practices among them.
  • the present invention relates generally to business processes and communication, and more particularly to an information technology based solution to create a dynamic communication platform with dynamic data structures, dynamic processes, consistency auditing, continuous release management, multilingualism and system integration support, all working together in a complementary manner in the form of a Business Process Meta-Model providing a way to implement concrete business processes in an agile manner.
  • Bundled services provided today include a hierarchy of elements, which makes the services themselves very complex. Further, the complexity of service delivery is ever increasing; the number of interaction channels between stakeholders in a service delivery environment is a major contributor to this complexity. Even to deliver a single service there is an involvement of multiple stakeholders such as providers, sub-providers and multiple users of each provider and sub-provider.
  • ESB Enterprise Service Bus
  • the ESB was able to integrate the communication of the independent components of heterogeneous service management while keeping a decoupled architecture, where a common messaging model was used in the ESB based on a hybrid architecture solution. This enabled quick integration and reduced bespoke application development.
  • the ESB was able to moderately reduce cost and to improve quality, efficiency and governance compared to legacy solutions.
  • the following communication related problems can be identified in the present service delivery domain:
    a) Difficulties in handling the complexity of the involvement of multiple parties/participants in the service delivery process (partners, providers, sub-providers, principals etc.).
    b) Complexity of delivered services (services are compositions of multiple sub-services and bundled services containing service components of varied types).
    c) Rapidly changing delivery models (services change rapidly depending on customer requirements, demand patterns, innovation, strategic requirements, government involvement etc.).
  • the present invention takes a different approach to resolving the service delivery challenge, where, by means of an agile platform (the Community Integration Framework), it facilitates the management of the above complexity through the deployment of processes, data models and controlled exchanges between organizations. Compared to the aforesaid legacy and hybrid ESB solutions, the present invention creates a successful communication model for the community, successfully integrating the systems and the people in the community.
  • the present invention would enable organizations to organize and manage their IT interactions and communication within a community of multiple and different partners as part of their business process.
  • the present invention provides governance to control the interactions within the community, thus enabling the organization to act as the central exchange, for the information for all the players within the community.
  • the present invention consists of a configurable toolkit, which reduces the typical application implementation time by introducing ready-made business content of domain specific global best practices, processes, and patterns for common scenarios. Furthermore, the present invention can be quickly and easily fine-tuned to an organization's exact needs, which would otherwise take a much longer time to build from scratch.
  • because the present invention has the capability of executing multiple versions on the same platform, seamless service delivery and service transitions are also improved.
  • This enables configuration and structure changes to the communication model to be made quickly in real time, which improves awareness in the community. This is achieved through the central communication model, coding standards and glossaries that are shared in the community, enabling participants of the community to understand business needs in a common language.
  • Meta-Model: in representing enterprise business processes, data and configuration of communication models, various aspects of these modeling requirements are required to be addressed.
  • Business Process Meta-Model (hereafter referred to as "Meta-Model") has basic constructs to facilitate all these requirements.
  • the present invention has a top-level isolation, making it possible to create multiple instances [refer to Figure 14] of models. Each model will be self-contained and completely isolated from others. This allows creation of multiple instances within the same organization or domain for varied purposes (an example would be having Design, Development, Testing/Sandbox and Production instances for those purposes). Data Structures, Process Definitions, Users, Roles, Profiles and all related information will be stored under an instance to make this isolation.
  • Typical business structures are multi-level and hierarchical, and these hierarchical structures can be represented by nodes in a parent-child relationship arrangement.
  • An employee having skills and hobbies can be represented by skill and hobby child nodes, making the employee node a parent.
  • data structures with any depth and width can be represented.
  • the information about these nodes is stored in a meta-node, which describes the actual node.
  • meta-node contains the information about the node.
  • Figure 16 shows the capability to define multiple child nodes in hierarchical structures using meta-nodes.
  • the ID field is used as an auto-generated unique key (primary key) for the table that stores the elements.
  • the ID field could be an auto-number field or a manually generated unique ID, which acts as the primary key for the table.
  • Display Legend specifies the wording / text, which could be used to show the name of the element when rendering data entry screens etc.
  • Display Assistance is an additional field, which can store more 'help information' regarding an entity.
  • Some elements such as meta-structure will only have a display legend while some structures such as meta-data will have both.
  • display legend can be used as the caption for the text entry boxes and the display assistance can be used as the additional 'help text' to describe more details on how to enter the data, rules etc. The latter would be in a detailed and descriptive form, in a data entry form arrangement.
  • The sequence order of a given element in a multiple-element arrangement is specified using a field named "SequenceNb".
  • This field can contain a number or a textual representation of how the element ought to be ordered in multiple-element scenarios. This is relevant to meta-nodes and meta-data.
  • when specified in meta-nodes, the actual data nodes will be arranged in the specified order.
  • the data columns/fields shown in the record will be ordered using this order. The use of this field is for rendering/presentation purposes only.
  • meta-nodes act as sub-structural components (for example, corresponding to sub-sections of a hard-copy form) which separate functional aspects such as visibility, rendering, and reading and writing permissions etc.
  • Rendering engines can read the permission Read/Write/Hidden and render the details appropriately.
  • Figure 6 shows the arrangement of sub-structural components of a typical form structure, in which each section's visibility and accessibility can be controlled at an individual level.
  • Special selected meta-data can be designated as a "Display Name" field.
  • the data element related to this field is used when objects of this meta-node type are listed in a list box or a combo box. For example, if there is an Employee object created using the Employee meta-node with Employee No, Employee Name and Salary meta-data, one field such as "Employee Name" can be selected/marked as the "Display Name" field. Whenever employees need to be listed, the Employee Name can be used to list them.
  • The type of data stored in records can be specified at the meta-data level using the "DataType", "Length", "FormatSpecification" and "ControlType" attributes.
  • the DataType field can contain the allowed data type for this particular data element. Possible values include Numeric, Text, Character, Date/Time, Long Integer and Double, but are not limited to these. The FormatSpecification field could contain specific formatting rules such as the number of decimal places, the allowed set of characters and other similar rules depending on the actual implementation required.
  • the ControlType field can be used to specify the type of user interface control to be used when rendering data entry forms and user interaction components.
  • Rules can be defined to specify whether null or empty values are allowed in data entry. This is also stored at the meta-data level; the "ValueNullYN" and "ValueEmptyYN" fields are respectively used for this purpose.
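  • The following sketch shows one way that such meta-data level rules (data type, length and the ValueNullYN/ValueEmptyYN flags) could be applied when validating a submitted value; the validator itself is an assumption, not part of the specification.
```python
from datetime import datetime

# Assumed mapping of DataType names to simple Python checks.
TYPE_CHECKS = {
    "Numeric":   lambda v: isinstance(v, (int, float)),
    "Text":      lambda v: isinstance(v, str),
    "Date/Time": lambda v: isinstance(v, datetime),
}

def validate(value, meta_data: dict) -> bool:
    """Apply meta-data rules: nullability, emptiness, data type and maximum length."""
    if value is None:
        return meta_data.get("ValueNullYN") == "Y"
    if value == "":
        return meta_data.get("ValueEmptyYN") == "Y"
    if not TYPE_CHECKS[meta_data["DataType"]](value):
        return False
    length = meta_data.get("Length")
    return length is None or len(str(value)) <= length

employee_name_meta = {"DataType": "Text", "Length": 50, "ValueNullYN": "N", "ValueEmptyYN": "N"}
print(validate("John", employee_name_meta))  # True
print(validate(None, employee_name_meta))    # False, nulls are not allowed for this field
```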
  • Qualifiers are a list of qualities from which one can be assigned to a node to specify a specialty or type. Qualifiers are defined by a meta-qualifier, attached to meta-nodes and assigned to nodes when nodes are created.
  • Figure 19 shows the diagram of meta-qualifier and qualifiers connected to a specific meta-node / node arrangement.
  • Figure 20 shows the usage of qualifier to qualify a node by providing specialization.
  • A single meta-node can be marked so that it can have its own hierarchy by making child nodes of the same type.
  • Figure 21 represents the graphical notation for such an arrangement. This allows users to create hierarchical data elements of the same type. For example, a location can contain locations as child nodes in a real-world data modelling requirement. There could be any number of levels (depth) of locations in a parent-child arrangement. This can be named a recursively repeating meta-node.
  • a special data field called "HierarchyYN" is used to record whether or not a node can be recursively repeated in a hierarchy.
  • qualifiers can be added for each level of the hierarchy.
  • a location can be specialized by Country, City, Region and Building qualifiers, which give a more specific meaning to each level of the meta-nodes.
  • qualifiers can be arranged in a hierarchical structure so that they govern the hierarchy of a recursively repeating meta-node by giving a set of hierarchy rules. This is shown in Figure 7.
  • Meta-nodes and nodes can stand independently without being qualified by any qualifier.
  • a special FunctionalID field is used for all meta level elements (Instance, Meta-Structure, Meta-Node, Meta-Qualifier and Qualifier) to uniquely identify an element which can be used in all references to those elements.
  • a data field or multiple data fields can be marked using a special attribute called “Contribute To Key” to denote that field will contribute to form a composite key for the data stored under a given meta-node.
  • the marking is done at meta-data level.
  • the functional key is calculated or computed according to the marked data elements which contribute to the key. This value can be directly stored in the "Node" object for fast query purposes.
  • the functional key scenarios for data elements are demonstrated in Figure 22.
  • a mechanism can be designed to refer to various nodes from other nodes' data elements to store relationships (references). References can be from two basic sources:
  • Referencing other nodes is achieved at the data level by defining a link (which is done by providing the key of the referred item) to the item referred. This is achieved by allowing the Meta-Model to have a key to the other node in the data record.
  • In the meta-data it is possible to define a referred meta-node denoting which type of nodes can be referred to. The reference mechanism is illustrated in Figure 4 and Figure 5.
  • a generalized process model can be defined using the same Meta-Model concept. Status, Transaction, Trigger, Business Rule, Condition, Case, Operator, Standard Routine, Standard Routine Parameter, Macro and Macro Parameter objects can be used to define processes and workflows.
  • the Meta-Model provides a way to design custom data structures which can be used to render graphical user interaction modules that accept input content/data in the form of form-based input screens. It is possible for meta-node based hierarchical structures to be involved in process and workflow definitions; these act as "form objects" which are used to record data.
  • Forms being the entry point for process workflows, it is possible to represent an entry point to a process by creating a new "form object" (creating a new form using the meta structure for the form) and entering data into it.
  • a "form object" (which is a node hierarchy with data associated to each node) can associate with a set of all possible “status” values for that particular form. (Status values are associated with a specific meta-node which will be available to assign to a node.) The very first/initial status can be stored in the meta-node itself (start-up status of the form in the workflow sequence).
  • Figure 23 shows the relationship of "Status" with "Meta-Node".
  • the Node Status table contains the various status values which are assigned to a given node at various times in the workflow process.
  • the sequence of statuses with which a node has been associated is marked with sequence numbers.
  • Transactions being the entry points of processes and workflows, a transaction can be configured to set up a process/workflow.
  • the Transaction entity contains the information about a transaction, and can be connected to security/accessibility to control who can access that particular transaction. A transaction will directly invoke the trigger configured for it.
  • Triggers are of several types (based on Button press, ESB even, Business Event etc.). Once trigger is invoked by the appropriate invoker, the trigger will check the configuration setup for its execution. A single trigger can have multiple business rules to be evaluated and based on the evaluation there can be macros which can be configured to execute. A trigger will set the appropriate status of a node being addressed, when executed. Standard routines can be written to generally invoke event by creating and calling a trigger even within business processes making it possible to install dynamic custom event handlers.
  • Business Rule is a composition of multiple conditions which comprise of multiple cases.
  • Business rule can invoke one or more macros based on the evaluation status of the set of conditions which the rule comprises of.
  • Condition contains a single logical statement (of multiple cases) which needs to be evaluated to satisfy the business rule it belongs to.
  • Conditions can be grouped and combined using logical "AND" and "OR" operations, and they can have a sequence number which determines the order in which evaluation will take place.
  • Condition contains multiple "cases" which are the smallest logically evaluable segments. A Case contains two operands and an operation which is evaluated to contribute to the condition it belongs to. Cases also contain a sequence number to determine the order of evaluation.
  • Each case has a Var1_ID field which can be set as the first input parameter. In typical cases this could be the id of data (a concrete instance of meta-data). Var2_ID could also be the id of reference data or a fixed value; this type is defined in the Var2_Class parameter. (Possible class values for Var2 are "Data Reference" and "Fixed Value".)
  • Standard Routines are sub-routines which can be developed using a computer programming language of choice. These sub-routines can be configured under the "Standard Routine" table and their parameters can be configured in the "Standard Routine Parameters" table. A standard routine is capable of executing a defined task which it is programmed to do. These standard routines can range from, but are not limited to, simply updating a data element of a given node, to executing an .EXE file, to sending an SMS using a gateway, to passing data to a web service. [612-1] The Standard Routine has a parameter called "Executable Component" which specifies what to execute as this standard routine. The value is implementation dependent and can be a specific function or procedure from an executable code library.
  • the Macros set up for the business rule can be executed.
  • a Macro is an instance of a Standard Routine which is set up for the given business rule.
  • Each macro contains multiple Macro Parameters of concrete values for standard routine parameters. Macros have their own sequence numbering to define the order of execution.
  • Macro Parameters are the concrete values passed for each Macro, which in turn will call a Standard Routine. Each parameter has a "Variable ID", which is the reference to the id of the data (a concrete instance of meta-data). "Variable Class" will specify whether the value is a reference or a fixed literal. (Possible class values are "Data Reference" or "Fixed Value".) In case the variable class is "Fixed Value", the fixed value is set in the "Variable Value" field.
  • the software tool can be built based on the concepts invented in the Meta-Model by means of a continuous release process to support service continuity and prevent service breakdowns.
  • the Meta-Model supports copying and creation of full or partial data structures, processes and configuration elements by providing a cloning mechanism. This cloning can be used to serve two purposes:
  • [708] The functionality of [707] provides the ability to build multiple layers of business content in an evolutionary manner by allowing layers such as "Base Layer", "Community Layer", "Domain Layer" and "Application Layer" etc., but is not limited to these.
  • [709] The functionality of [707] also provides the ability to implement various release configurations such as "Major", "Minor" and "Patch". Hot patching is also possible, as applying a patch does not require shutting down the system because the continuous application of releases is supported.
  • Figure 9 shows the effect of adding an additional field "Contact Details" to the customer data structure.
  • a new meta-data element is added with a different "Valid From" date. All the data related to the previous structure will exist as it was, and when new data is entered from the effective date of the new data element it will be stored against the newly added meta-data ("Contact Details").
  • When the system accesses the old structure only the old data will be used, but when the system accesses the new structure the new data elements will also be retrieved accordingly. (This is because old processes only know old data and new processes know new data.)
  • Figure 10 shows the effect of deleting an existing data field "Contact Details" of the customer data structure.
  • the meta-data element is deleted by setting the "Valid To" date to an appropriate ending date for the meta-structure. All the data related to the previous structure will exist as it was, and when new data records are entered into the structure only the valid data elements will be entered / prompted. (This is because old processes only know old data and new processes know new data.)
  • Meta-Model supports consistency auditing facilities by providing required information.
  • Multilingualism support can be easily implemented by adding two additional tables to the Meta-Model.
  • the multilingualism feature supports entering literals in multiple languages, as well as allowing users to view content in their preferred language if available.
  • Meta-data contains a special field "ValueTranslation" to specify whether the field is allowed to be translated into multiple languages. Translations are permitted only if the "ValueTranslation" field is set to TRUE/YES at meta-data level.
  • an Auditing Program can access this information to produce a report of all the historical values of each meta-element which are in the inactive state, to provide a log of change history. [852] As all the data changes are stored in copies of nodes and data, they will also be available to track the history of data changes. A report can be produced to list all the changes done to nodes and their data. This report will provide the required transaction history.
  • Users can have multiple profiles and Profiles will be associated with multiple Users.
  • the objective is to group a set of common types of profiles to share common rights.
  • Roles are used to collect a set of permissions for a given scope which the role controls.
  • roles can be assigned to profiles according to the intended function of each profile.
  • a single Role can be assigned to multiple Profiles as well as a single Profile can have multiple Roles to derive its available/allowed Transactions.
  • API Application Programming Interfaces
  • REST Representational State Transfer
  • Implementation will provide functionality to push and pull data based on configured setup.
  • the Meta-Model being a dynamically configurable system, it is possible to create structures to represent various automation interchange end-points. By providing a communication API, it is possible to create consistent communication models to integrate various participants, including external systems and legacy systems, into a common platform.
  • Once the API set is created to access the Meta-Model and its content, it is also possible to generate user interface screens to represent and access the Meta-Model and its data, enabling the communities to access the communication defined by the model's concrete business content, making it a tool to support community integration.
  • Forms (using data structures) such as Incident, Problem, Configuration, Issue Tracking and Employee Time Sheet Entry forms, but not limited to these, can be created.
  • the process model can be used to implement workflows using the forms created; rules and interfacing contracts can be defined, as well as the integration systems.
  • While providing the Form, Workflow and Business Rules functionality, the concrete implementation of a Meta-Model will also provide the continuous release, multilingualism, consistency auditing and transaction auditing functionality in parallel and simultaneously, providing the missing essentials of today's models in one unit.
  • This Business Process Meta-Model was designed to address, but is not limited to, the community integration, communication and overall service delivery process management related problems and challenges existing in the Service Delivery Domain, which have been set out hereinbefore: a) Difficulties in handling the complexity of involvement of multiple parties/participants in the form of Partners, Providers, Sub-Providers, Principals etc., in the service delivery process;
  • the present Invention represents various communities and their participation by means of flexible Meta-Model based data structures to define such communication models.
  • Using the functions of the Meta-Model discussed in items [504] to [521] it is possible to create data structures which represent the participants of the community and the relationships between them, in the form of business content.
  • the flexibility of the Meta-Model allows creating required data structures to represent partners, their relationships as well as communication structures and the communication paths using the elements described.
  • Meta-nodes can be used to create elements to represent each partner and meta-node relationships such as "parent child hierarchy” and “references” can be used to define the relationships between them.
  • Forms can be created using required data structures to form communications.
  • the present invention addresses and resolves this by representing and managing the complex service delivery models using this novel Meta-Model.
  • the Business Process Meta-Model provides a way to represent various communities and their participation by means of data structures using the flexible Meta-Model. Also it supports definition of required communication models, and allows the management of complex service delivery models using the said Meta-Model.
  • the Business Process Meta-Model allows designers to change the delivered services rapidly using graphical Meta-Model designing tools, and it also comprises the ability to implement this in a continuously evolving manner.
  • Using the Meta-Model in the present invention it is possible to change the structures without requiring additional programming support or any other additional software writing. This is by providing a way to dynamically create and change data structures for rapidly changing needs. These rapid changes can be applied to a live solution by using continuous release management detailed in paragraph [700] found herewith.
  • the present Invention provides business best practices in the form of ready-made data structures and processes to solve common business problems. Through this invention this problem is resolved by various functional content being created, using the above core Meta-Model and process model, in order to create solutions / business support functions such as problem tracking, issue tracking, release management, change management etc. e) The problems associated with streamlining the interaction of the System Component and People Component, which is due to the lack of defined processes;
  • the Business Process Meta-Model resolves this by having the capability to define processes to streamline communication of the participants of the respective communities. f) Inability to maintain consistent communication between parties, which is the result of lack of governance, management of communication and also the lack of controlled communication;
  • the Business Process Meta-Model provides a mechanism to implement a defined and controlled communication between participants to provide consistent communication model.
  • the present invention uses defined and consistent data structures to represent every entity in the communication process; hence, it is possible to streamline the communication of these participants involved.
  • These processes can be defined to specify the rules for how the communication and interactions are managed with other processes as well as with the people and systems, which enables a defined and controlled communication between these participants.
  • the Business Process Meta-Model has the ability to define common standards, glossaries, business structures and terms and share them within the community so that concepts are interpreted in the same way, in the form of a common language.
  • the Present Invention provides facilities and infrastructure to connect External/Partner IT systems to participate in the communication by providing a simple and easy integration platform.
  • the Meta-Model is used to facilitate the information interchange.
  • an automated service can be created to use the Meta-Model along with the services layer specified in paragraphs [1000] - [1002] herewith, with the support of an ESB to communicate messages (which contain the required data according to the created structures).
  • The Business Process Meta-Model resolves this issue by the management of automated releases and continuous releases of system changes in an un-interruptible manner, applying changes to systems to align with the rapidly changing business and providing smooth transition cycles.
  • the present invented Meta-Model consistently supports a way to define and manage business processes and their changes effectively, efficiently, in less time and at low cost to address this problem.
  • the agility in designing in this present invention allows effective designing of business processes, while the Continuous Release Model and Consistency Auditing Model provide an efficient, fool-proof deployment of business processes with a guaranteed up time.
  • Through the present invention, the elimination of the additional programming effort required for changes can significantly reduce the time and cost associated with change implementations.
  • The Business Process Meta-Model has the ability to define and handle real world business data structures and processes using a highly dynamic, configurable Meta-Model with minimum engineering effort, consistency and a short time span. Furthermore, these processes are developed in an agile manner.
  • the present Invention is a single, integrated, comprehensive solution which can easily be implemented, adapted and incrementally developed to suit an organization's requirements within short time spans.
  • a specific process model extending the Meta-Model in paragraph 1 further comprising, a. configurable data structures and processes to define business processes using configuration only (without writing any programs, re-compilation or any other programming tasks), while providing, b. the ability to define transactions which are invoked by various triggers, with facilities for the definition of business rules which are executed based on evaluated criteria, where the criteria are created from conditions and their cases, while providing, c. the ability to execute defined standard routines with dynamic parameters, where standard routines are available for required common activities to be performed.
  • a Meta-Model according to paragraph 1 further comprising the intention to keep the business data structures, processes and rules.
  • a Meta-Model according to paragraph 1 and paragraph 2 further comprising the ability to copy and paste business data structures, processes and rules from one instance to another.
  • a Meta-Model according to paragraph 1 further comprising the capability to hold multiple active versions of the model in the Meta-Model itself in the form of co-existing versions of the same structural element.
  • a Meta-Model according to paragraph 1 further comprising the ability to run multiple release versions of the same Meta-Model in a single platform in a harmonized manner without affecting the data or functionality of each version.
  • a Meta-Model according to paragraph 1 further comprising a hierarchical model with functional permissions on different levels of structures for given roles and profiles.
  • a Meta-Model according to paragraph 1 further comprising auditing and historical change information inside the structures themselves, without the need for an additional auditing mechanism to track audit information, while providing,
  • a Meta-Model according to paragraph 1 further comprising the capability to audit consistency of existing data with modifications done to Meta-Model to fix any consistency issues, while providing,
  • a Meta-Model and a process model according to paragraph 1 further comprising the intention to keep ready-made best practices.
  • the design supports the capability to carry ready-made business content in the form of processes and structures which can be immediately applied. This provides best practices and processes as ready-made content to quickly transition to and implement for service delivery, bringing quick wins.
  • a Meta-Model and a process model according to paragraph 1 further comprising the intention to support sharing of existing best practices with interested members in the community. This sharing facility for practices is built into the Meta-Model and process model with release management to provide the necessary functionality to copy and paste business best practices between interested parties without interrupting services.
  • An integration platform further comprising the facilities to provide un-interrupted operation and running of business processes and structures of the business content using continuous release management, by allowing multiple versions of the same processes or data structures to run simultaneously, making it possible to smoothly transition from one version of the business content to another without service interruptions, with zero down time.
  • An integration platform according to paragraph 4, using the concept presented in claim 1, further comprising a method of applying changes to the processes and application data structures without program recompilation, by using purely configuration based data modeling.
  • An integration platform according to paragraph 4, using the concept presented in claim 1, further comprising a method of applying changes to the processes and application data structures in an incremental manner to provide agile development of business processes while the system is continuously running.
  • a Meta-Model according to paragraph 1 which, while providing the above facilities, provides in-built functions such as management of the Meta-Model itself, controlling changes to the Meta-Model itself, and the availability of audit information and the versioning of structures, processes and data in the Meta-Model itself, making a self-tracked Meta-Model not requiring the support functions to be separately included.
  • FIG. 3 Meta-Model Core Tables. This diagram shows the core tables of the Meta-Model.
  • Figure 14 Isolated instances.
  • the Business Process Meta-Model has a top level isolation making it possible to create multiple, independent, self-contained, self-functioning instances.
  • Figure 15 Specifies the meta-structure arrangement to represent data structures in an instance.
  • Figure 16 Hierarchical Data Structures using Meta Nodes.
  • This Figure shows the capability to define multiple child nodes in hierarchical structures using meta-nodes.
  • Figure 6 The arrangement of sub-structural components of a typical EDI Form (Electronic Data Interface Form), in which the sections' visibility and accessibility can be controlled at an individual level.
  • Figure 17 Meta-data representation of a sample meta-node
  • Figure 18 Meta-node to Node and Meta-Data to Data relationship.
  • Figure 19 Diagram of meta-qualifier and qualifiers connected to a specific meta-node by providing specialization to the data nodes that meta-node represents.
  • Figure 20 Depicts the usage of qualifier to qualify a node by providing specialization.
  • Figure 21 Represents the recursively repeating meta-node arrangement and notation for such arrangement. This allows users to create hierarchical data elements of same type.
  • Figure 7 Usage of Qualifiers to enhance the recursively repeating meta-node's hierarchy.
  • Functionality: qualifiers can be arranged in a hierarchical structure which governs the hierarchy of recursively repeating meta-nodes by giving a set of hierarchy rules.
  • Figure 22 Functional Key scenarios for Node/Data arrangements. In some cases multiple objects can exist when there is no unique key constraint specified. The functional key scenarios for data elements are demonstrated here.
  • Figure 4 Definition of References. In the meta-data it is possible to define a referred meta-node denoting which type of nodes can be referred. Reference mechanism is illustrated in Figure 4.
  • Figure 5 Implementation of References.
  • Figure 8 Process Model - Part 1 - Meta-Nodes, Nodes and Associated Status.
  • Figure 23 Process Model - Part 2 - Triggers, Transactions and Business Rules.
  • Figure 9 This figure shows the effect of adding an additional field "Contact Details" to the customer data structure.
  • a new meta-data element is added with a different "Valid From” date.
  • Figure 10 Release management - Delete existing meta-data elements.
  • Figure 24 Release management - Process Versions accessing Data Structure Versions.
  • FIG 11 Multilingualism Model. According to the arrangement shown in this Figure, additional tables are introduced, namely "Language” and “Translations"
  • FIG. 12 Identity and Access Management Model - Accessibility to Meta-Model elements is done using Identity and Access Management Model.
  • the table structure is shown in this figure.

Abstract

A method of processing data, the method comprising: receiving data to be processed; receiving a model representing the data and relationships between the data, the model defining at least one rule associated with data represented by the model; and processing the received data based upon the at least one rule defined by the model.

Description

DATA PROCESSING
The present invention relates to methods and apparatus for data processing.
The use of computers to process data is now ubiquitous. Systems for processing data are increasingly able to handle increasingly complex data processing tasks; however, the systems designed to handle data processing are themselves also increasingly complex. Managing systems for processing data therefore presents various challenges.
Increasing complexity of data processing systems means increasing challenges in managing the data processing systems and the data that is processed. For example, data processing systems may require updates to maintain or improve the processing of data. Such updates may for example add additional functionality where desired, may improve existing functionality, or may provide for integration of separate systems into the data processing system.
Data processing systems may for example comprise functionality that is provided and managed by a single entity or may include functionality provided by a third party service provider, which may for example be located at a separate physical location. Integration of different systems presents various challenges, for example relating to consistency of how data is communicated between different systems.
There could be many things that happen to the data processing environment in an enterprise IT ecosystem such as changes to data models, changes to business processes, configuration of integrated systems and other such changes. It has been a real challenge for today's CIOs (Chief Information Officers), IT Managers and other IT professionals to maintain systems such that the systems are constantly and consistently aligned to continuously changing business processes. It is the IT Manager's responsibility to keep such changes constantly applied to the IT infrastructure and make sure the business processes are up-to-date, whilst ensuring that the transition between iterations of the system is smooth to avoid any issues of business continuity.
One proposed solution for integrating computing infrastructure is the Enterprise Service Bus (ESB) architecture in which a common messaging model is provided to allow systems to be integrated. In the ESB architecture system integration is provided by way of a bus that receives data from systems in the form of messages, processes the messages and outputs messages to other systems. Whilst the ESB allows systems to communicate with one another, the ESB is complex to maintain and configure and does not provide sufficient flexibility for data processing systems to be easily updated. There therefore remains a need for improvements in data processing.
It is an object of the invention to provide improvements in methods and apparatus for data processing.
According to a first aspect of the invention there is provided a method of processing data, the method comprising: receiving data to be processed; receiving a model representing the data and relationships between the data, the model defining at least one rule associated with data represented by the model; and processing the received data based upon the at least one rule defined by the model.
By processing data based upon a model defining at least one rule associated with data represented by the model, the processing can be controlled by the model. Controlling processing of data based upon a model in this way allows aspects of data processing to be designed in advance and controlled by the model. The model allows for example multiple systems to process the data in a consistent way. Additionally the model can allow the data processing to be modified in a straightforward way.
The model may represent a plurality of instances of the data, each instance of the data having associated validity data. Processing the data based upon the model may comprise determining an instance of the data to be processed based upon the validity data associated with each instance of the data. For example, each instance of the data may be processed to select a current instance of the data based upon the validity data.
The validity data may comprise a time, wherein determining an instance of the data to be processed based upon the validity data associated with each instance of the data comprises determining a relationship between the time associated with instances of the data and a predetermined time. The model may model a plurality of hierarchical data structures and relationships between the hierarchical data structures. The model therefore allows objects and relationships of a system to be pre-defined and processing to be controlled based upon the objects and relationships.
The at least one rule may define properties of the data. For example, the at least one rule may define the form of the data and required fields in the data. The model may for example allow a data structure to be instantiated based upon the model.
The at least one rule may define properties associated with processing of the data. For example, the at least one rule may define access permissions for the data and processing that is permitted to be carried out on the data.
The model may define a plurality of groups, wherein each group is associated with one or more data items represented by the model. Each group may define a logical isolation from other ones of the plurality of groups. The groups may for example define collections of data items and/or objects associated with the data processing system that have a functional or logical relationship. Grouping data items in this way allows modifications to be made to a group in a straightforward way.
The model may comprise a plurality of entities, the plurality of entities together representing the data and relationships between the data. The model may comprise a plurality of instances of an entity, each instance of the entity comprising data defining a version associated with the entity. A first plurality of entities may represent data and relationships between the data and a second plurality of entities may define rules associated with the data and processing of the data. Each of the second plurality of entities may have a relationship with at least one entity of the first plurality of entities, each relationship providing an association between a first entity and a second entity, wherein data represented by a first entity is processed based upon rules defined by second entities having a relationship with the first entity. The first plurality of entities may for example correspond to objects of a system and the second plurality of entities may provide meta data associated with the objects of the system. The meta data may provide rules associated with the data for processing.
Each entity may comprise data associated with creation and/or modification of the entity. The data associated with creation and/or modification of the entity may comprise data associated with a property selected from the group consisting of: a date the entity is created, a user that created the entity, a date the entity was last updated and a date that the entity was last used. By including data in the model associated with creation and/or modification of entities of the model, data auditing can be automatically carried out using the model.
The method may further comprise receiving a further model, the further model representing data processing functionality and rules associated with the data processing functionality, wherein processing the received data comprises: receiving data indicating data processing functionality to be performed on the received data; and processing the received data based upon the received data indicating data processing functionality and based upon rules associated with the data processing functionality. The further model may therefore provide further rules associated with the system and provide further control and integration of data processing systems.
The method may further comprise receiving updated model data, and modifying the model representing the data and relationships between the data based upon the updated model data, wherein modifying the model comprises: storing an instance of the model corresponding to the received model; and generating an instance of the model based upon the received model and the updated model data. That is, instances of the model may be stored whenever the model is modified. In this way the model instances allow data auditing of the model and the facility to undo any changes in a straightforward way.
The model may model a plurality of qualifiers associated with data represented by the model. The qualifiers may for example further specify properties of the data.
According to a second aspect of the invention there is provided a method of processing data at a plurality of data processing systems, the method comprising: processing first data at a first data processing system according to any preceding claim to generate first processed data; and processing the generated first processed data at a second data processing system according to any preceding claim.
The second aspect of the invention therefore uses the model of the first aspect of the invention to provide consistency between different systems and allows multiple systems to be integrated and to process the same data. Improved integration of the systems is therefore provided. Aspects of the invention can be implemented in any convenient form. For example computer programs may be provided to carry out the methods described herein. Such computer programs may be carried on appropriate computer readable media which term includes appropriate non-transient tangible storage devices (e.g. discs). Aspects of the invention can also be implemented by way of appropriately programmed computers and other apparatus. The invention may for example be carried out using either a single computer system/server or multiple computer systems/servers and other tools can be used in the form of Database Servers, ESBs, Application Servers and Web Servers etc. with the support of programs written based on the described model/methods.
Embodiments of the invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a schematic illustration of an example data processing system;
Figure 2 is a schematic illustration of a computer of the plurality of systems of Figure 1 ;
Figure 3 is a schematic illustration of a meta-model showing entities of the model as well as relationships between the entities;
Figure 4 is a schematic illustration of references defined between entities within the data model;
Figure 5 is a schematic illustration of use of references between entities of the meta-model at run-time;
Figure 6 is a schematic illustration of a structure for display of components with multiple levels;
Figure 7 is a schematic illustration of a hierarchical structure of qualifiers in the meta-model of Figure 3;
Figure 8 is a schematic illustration of the process model in further detail; Figures 9a and 9b schematically illustrate addition of a meta-data element to a meta-node of the model of Figure 3;
Figure 10 is a schematic illustration of deletion of meta-data from a meta-node of the model of Figure 3;
Figure 11 is a schematic illustration of translation of data using the models described herein; Figure 12 is a schematic illustration of an accessibility model;
Figure 13 is a schematic illustration of a data processing system using the meta-model;
Figure 14 is a schematic illustration of Isolated instances. The Business Process Meta-Model has a top level isolation making it possible to create multiple, independent, self-contained, self-functioning instances;
Figure 15 specifies a meta-structure arrangement to represent data structures in an instance;
Figure 16 is a schematic illustration of Hierarchical Data Structures using Meta Nodes;
Figure 17 is a schematic illustration of Meta-data representation of a sample meta-node;
Figure 18 shows Meta-node to Node and Meta-Data to Data relationship;
Figure 19 shows meta-qualifier and qualifiers connected to a specific meta-node by providing specialization to the data nodes that meta-node represents;
Figure 20 shows the usage of qualifier to qualify a node by providing specialization;
Figure 21 shows the recursively repeating meta-node arrangement and notation for such an arrangement. This allows users to create hierarchical data elements of the same type;
Figure 22 shows Functional Key scenarios for Node/Data arrangements. In some cases multiple objects can exist when there is no unique key constraint specified. The functional key scenarios for data elements are demonstrated here;
Figure 23 shows part of the Process Model of Figure 8;
Figure 24 shows release management - Process Versions accessing Data Structure Versions.
Referring first to Figure 1, an internal data processing system 1 comprises one or more systems 2, 3, 4, each system of the one or more systems 2, 3, 4 providing computing infrastructure comprising hardware and/or software. Each of the plurality of systems may for example be associated with a respective business function and may provide computing functionality to the respective business function including data storage, processing and management. As shown in Figure 1, the one or more systems 2, 3, 4 may be arranged to communicate with other ones of the one or more systems such that data may be processed using the plurality of systems. It will be appreciated that additional systems and communication between systems may be provided.
Additional computing infrastructure comprising hardware and/or software may additionally be provided by third party systems 5, 6. The third party systems may for example comprise computing services that are outsourced to third party providers such as data storage, management and processing.
It will be appreciated that the systems 2, 3, 4, 5, 6 may comprise hardware, software or both. Figure 2 shows a computer 7 on which systems 2, 3, 4, 5, 6 of Figure 1 may operate in further detail. It will be appreciated that each of the computers used in the data processing system of Figure 1 will typically have the same general structure. It can be seen that the computer comprises a CPU 7a which is configured to read and execute instructions stored in a volatile memory 7b which takes the form of a random access memory. The volatile memory 7b stores instructions for execution by the CPU 7a and data used by those instructions. For example, in use, data that is communicated between the systems of Figure 1 may be stored in the volatile memory.
The computer 7 further comprises non-volatile storage in the form of a hard disc drive 7c. Data such as data communicated between the systems 2, 3, 4, 5, 6 may be stored on hard disc drive 7c. The computer 7 further comprises an I/O interface 7d to which are connected peripheral devices used in connection with the computer 7. More particularly, a display 7e is configured so as to display output from the computer 7. Input devices are also connected to the I/O interface 7d. Such input devices may include a keyboard 7f and a mouse 7g which allow user interaction with the computer 7. It will be appreciated that the computer may have other input interfaces. A network interface 7h allows the computer 7 to be connected to an appropriate communications network so as to receive and transmit data from and to other computers of the systems 2, 3, 4, 5, 6. The CPU 7a, volatile memory 7b, hard disc drive 7c, I/O interface 7d, and network interface 7h, are connected together by a bus 7i.
Figure 3 shows a data model of entities associated with data of a data processing system such as the data processing system of Figure 1 . The model shown in Figure 3 should be understood as illustrative only, and it will be appreciated that entities may be modelled in any convenient way.
As shown in Figure 3, the model comprises an instance component 301 that provides a top level isolation for instances of the model and that allows multiple instances of the model to be created. The model of Figure 3 further comprises tables defining one or more hierarchical data structures.
In general terms, the model includes node and data entities that correspond to run-time data related to objects (such as Employee record, Skill record, Incident Record, Insurance Claim Request Record etc.), and meta entities that provide a definition of associated objects of the data processing system.
In the illustration shown in Figure 3, each hierarchical data structure comprises a meta-structure 302 and one or more nodes 303. Each of the one or more nodes 303 corresponds to an entity that is desirable to be represented in the model of the data processing system. For example, nodes 303 may be associated with employees, skills and hobbies in a data processing system that includes data associated with a business and that models employees of that business. It will of course be appreciated that nodes 303 may represent any data that is desirable to be processed. Nodes 303 may be defined so as to define a multi-level hierarchical data structure. For example, as noted above, nodes may be associated with employees, skills and hobbies, with an employee node having skill and hobby child nodes.
The meta structure 302 has a one to many relationship with nodes 303 and provides a grouping of nodes 303 in the data processing system to provide logical separation of entities represented by the nodes 303. For example, each meta-structure may be associated with a plurality of associated entities of the data processing system. The logical separation of entities may for example be based upon behaviour and/or purpose of the entities of the data processing system. Each node 303 has a one to many relationship with one or more data structures 304 that each stores data associated with the node 303. Each data structure 304 has a relationship with a metadata structure 305 that provides rules associated with related data structures 304.
The rules provided by metadata structures 305 may for example define rules associated with the data such as data type and format rules, as well as default values. The metadata structures 305 may additionally or alternatively define rules associated with representation of data associated with a data structure 304 such as an order in which data is displayed by software arranged to process data in the data processing system.
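By way of illustration only, the following Python sketch shows one possible in-memory representation of the entities described above. The class and field names (MetaStructure, MetaNode, MetaData, Node, Data, min_instances and so on) are illustrative assumptions and do not form part of the model definition or of any particular database schema.

# Illustrative sketch only: a minimal in-memory representation of the core
# entities described above (meta-structure, meta-node, meta-data, node, data).
# All class and field names are assumptions made for illustration.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MetaData:
    gid: int
    name: str                  # e.g. "Employee Name"
    data_type: str             # e.g. "string", "date"
    default_value: Optional[str] = None

@dataclass
class MetaNode:
    gid: int
    functional_id: str         # design-time identifier, e.g. "EMPLOYEE"
    name: str
    min_instances: int = 0
    max_instances: Optional[int] = None
    meta_data: List[MetaData] = field(default_factory=list)

@dataclass
class MetaStructure:
    gid: int
    functional_id: str
    name: str
    meta_nodes: List[MetaNode] = field(default_factory=list)

@dataclass
class Data:
    gid: int
    meta_data_gid: int         # the rule definition this value conforms to
    value: str

@dataclass
class Node:
    gid: int
    meta_node_gid: int         # the definition this run-time record is based on
    parent_gid: Optional[int] = None   # parent node in the hierarchy, if any
    data: List[Data] = field(default_factory=list)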
Each metadata structure 305 may specify permitted references between data structures 304 and nodes, as illustrated in Figures 4 and 5. For example an insurance claim record may have a data field called customer which refers to the actual customer who is making the claim. Both the insurance claim record and the customer record are business objects/entities, one referring to the other. The metadata structure may therefore define a permitted reference between the customer field of a node and a customer node of the meta-model.
Each node 303 has a relationship with a meta node 306, which stores information associated with related nodes 303 defining how associated nodes are created at run-time and how the data associated with the nodes may be processed. Each meta node 306 may for example store data indicating a maximum and minimum number of instances of an entity represented by a node 303 that may be created. The meta node 306 may therefore be used during the processing of data using the data processing system to ensure that a maximum number of instances of an entity has not been exceeded before a further instance of an entity is created. The meta node therefore provides a design for the data within the system that may be defined during an initial design phase.
The meta node 306 may additionally or alternatively be used in the display and processing of data associated with nodes in a similar manner to the metadata structure 305. For example, the meta node 306 may specify permissions for functions such as visibility, rendering, reading and writing of data associated with a node 303 that has a relationship with a meta node 306. The meta node 306 may additionally or alternatively specify a structure for display of components, for example as illustrated in Figure 6. For example an insurance claim form having 3 sections, a-Requester Information, b-Approver Information, c-Accounts Managers section, may be created using respective meta-nodes based upon information defined in a meta-node. The meta-node may for example define permissions for different users of the system to control visibility, editing etc. When rendering the claim form to the end user, these sections can be enabled, disabled or hidden based on a user's role and permission for the role as described below.
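Purely as a hedged illustration of the rendering behaviour described above, the sketch below filters the three example sections of the insurance claim form according to per-role permissions. The permission values, role names and dictionary layout are assumptions made for the example and are not prescribed by the meta-node definition.

# Illustrative sketch only: enabling, disabling or hiding form sections based
# on the role of the current user. Permission values and roles are assumed.
SECTION_PERMISSIONS = {
    # section (meta-node)        role -> permission
    "Requester Information": {"requester": "edit",   "approver": "read",   "accounts": "read"},
    "Approver Information":  {"requester": "hidden", "approver": "edit",   "accounts": "read"},
    "Accounts Managers":     {"requester": "hidden", "approver": "hidden", "accounts": "edit"},
}

def render_form(role: str) -> list:
    """Return (section, mode) pairs visible to the given role."""
    rendered = []
    for section, perms in SECTION_PERMISSIONS.items():
        mode = perms.get(role, "hidden")
        if mode != "hidden":        # hidden sections are not rendered at all
            rendered.append((section, mode))
    return rendered

print(render_form("approver"))
# [('Requester Information', 'read'), ('Approver Information', 'edit')]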
Each meta node 306 has a relationship with one or more metadata structures 305 such that a meta node 306 may specify rules associated with metadata structures. The meta node defines characteristics and behaviour of a business object, whereas the metadata structure defines individual data elements or fields in the business object.
Each node 303 may additionally have a relationship with a qualifier 307 that associates one or more properties or types with a node 303. Each qualifier 307 has a relationship with a meta qualifier 308 that provides data associated with each associated qualifier 307 in a similar manner to meta nodes 306 and metadata structures 305. Qualifiers 307 can be defined in a hierarchical relationship in a similar manner to nodes, as illustrated in Figure 7.
Each meta qualifier 308 has a relationship with a meta node 306 that may define properties of the meta qualifier 308. Each meta qualifier provides a grouping of qualifiers available for an associated meta node and defines properties and behaviour of the group of qualifiers. When a node is created at run time, a user can select a qualifier and associate the selected qualifier with a node to qualify the node. For example, when creating an employee John it is possible to select a qualifier for the employee from three qualifiers "Internal", "External" or "Contracted". These qualifiers are grouped and connected to the Employee meta node using the appropriate meta qualifier.
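As a simple sketch of the qualifier selection described above, and only under assumed names for the meta-qualifier group and its storage, a qualifier drawn from the group defined by a meta-qualifier might be attached to a node at run time as follows.

# Illustrative sketch only: associating a qualifier from a meta-qualifier group
# with a node, as in the "Internal"/"External"/"Contracted" employee example.
META_QUALIFIERS = {
    "EMPLOYEE_TYPE": ["Internal", "External", "Contracted"],   # grouped under the Employee meta-node
}

def qualify_node(node: dict, meta_qualifier_id: str, qualifier: str) -> dict:
    allowed = META_QUALIFIERS.get(meta_qualifier_id, [])
    if qualifier not in allowed:
        raise ValueError(f"{qualifier!r} is not a valid qualifier in group {meta_qualifier_id}")
    node["qualifier"] = qualifier
    return node

john = qualify_node({"name": "John"}, "EMPLOYEE_TYPE", "Contracted")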
Each element of the model of Figure 3 has a key gid that uniquely identifies the associated element of the model and allows the data processing system to automatically identify entities of the model that are associated with run-time data.
Additionally, each of the meta structure, meta node, meta qualifier and qualifier elements include a FunctionallD field that identifies each of the meta elements of the model and that can be used during design and configuration of the model, for example to provide references between entities of the model. A special attribute Contribute To Key may be used to indicate entities that are used in the generation of the FunctionallD field for a meta structure. For example, an Employee object may be created by including an employee meta-node in the model and creating data element definitions or fields such as PartnerlD, EmployeelD, Employee Name and Age etc. using meta-data entities in the model. A key field may be used to identify each instance of the employee entity based upon a composite key of PartnerlD+EmployeelD data fields. The PartnerlD and Employee ID fields may therefore be marked as "ContributeToKey" such that the fields are used to calculate an identifier key for the data records. For example, an employee Bob may have FunctionallD 1-1 while John may have FunctionallD 2-1 which means both of them have the same EmployeelD 1 but two different partners such that a unique key is generated for each employee using a composition of keys.
Data auditing values are defined by the model such that any data as well as any rules that are defined relating to the data that is created or processed based upon the model can be tracked. In particular, each entity of the model specifies "date created", "created by", "last update date" and "last used" fields that specify data associated with creation and update of entities of the model. The data auditing values can therefore be used to determine when each entity of the model was created and/or updated and also who created and/or updated the entity.
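The composite functional key described above can be illustrated with a minimal sketch following the PartnerID + EmployeeID example; the use of a "-" separator for concatenation is an implementation choice assumed for the example rather than something mandated by the model.

# Illustrative sketch only: computing a functional (composite) key from the
# data fields marked "Contribute To Key" at meta-data level.
def functional_key(record: dict, key_fields: list) -> str:
    """Concatenate the marked fields, in meta-data order, into a composite key."""
    return "-".join(str(record[f]) for f in key_fields)

employee_key_fields = ["PartnerID", "EmployeeID"]   # fields marked Contribute To Key
bob  = {"PartnerID": 1, "EmployeeID": 1, "Name": "Bob"}
john = {"PartnerID": 2, "EmployeeID": 1, "Name": "John"}

assert functional_key(bob,  employee_key_fields) == "1-1"
assert functional_key(john, employee_key_fields) == "2-1"   # same EmployeeID, different partner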
The model additionally includes fields indicating validity of all entities of the model. The fields indicating validity of entities of the model are based upon "valid from" and "valid to" fields defined within each entity of the model. The fields indicating validity of entities of the model may be processed to determine whether an entity is currently valid. In this way, multiple instances of each model entity and also multiple instances of data associated with each model entity may be created and stored, and the data processing system can determine which entity and/or instance of an entity should be used as the basis for data processing using the fields. By storing multiple instances of model entities and data associated with model entities in this way, tracking of changes is permitted. Additionally, instances of data and rules can be created within a data processing system and implemented automatically as desired, thereby allowing improved continuity of data processing. That is, by allowing multiple instances to be created and a run time instance to be changed for a new instance by change of a single data field, updates to the data and rules used in the processing of data can be implemented in a straightforward and real time manner.
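The selection of a currently valid instance using the "valid from" and "valid to" fields may be sketched as follows; the field names, the use of calendar dates and the open-ended "valid to" convention are assumptions made for the example.

# Illustrative sketch only: choosing the instance of an entity whose validity
# window contains a given date.
from datetime import date
from typing import Optional

def current_instance(instances: list, at: Optional[date] = None) -> Optional[dict]:
    """Return the instance whose validity window contains the given date."""
    at = at or date.today()
    for inst in instances:
        valid_from = inst.get("valid_from", date.min)
        valid_to = inst.get("valid_to") or date.max     # None means open-ended validity
        if valid_from <= at <= valid_to:
            return inst
    return None

customer_meta_data = [
    {"version": 1, "valid_from": date(2010, 1, 1),  "valid_to": date(2011, 1, 19)},
    {"version": 2, "valid_from": date(2011, 1, 20), "valid_to": None},
]
assert current_instance(customer_meta_data, date(2011, 6, 1))["version"] == 2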
At run-time an instance of the model of Figure 3 is created based upon the rules specified in the meta-elements of the model. For example, an instance of the model of Figure 3 may be instantiated in a database. The node and data entities of the model store data that is processed in a data processing system and the data processing system may process the data in accordance with rules specified in the meta entities of Figure 3.
It will be appreciated that multiple instances of an entity of the model may be created, as described above. During processing of data of the database the model may be used to determine which instance of an entity of the model is the currently valid instance and processing may be performed accordingly. For example, a plurality of instances of an employee node may be created that corresponds to a particular employee and data within the model allows selection of the appropriate instance of the employee node for the employee.
The meta structures of the model of Figure 3 may additionally be used to define how data is displayed in a data processing system, and how data that is input by a user is processed in response to the user input. For example, the meta structure, meta node and meta data along with meta qualifiers and qualifiers can be used to render data in a data input screen. When this information is used to dynamically render the data entry screen using appropriate software/programming tools, users can navigate to such screens and enter data, which will be stored in the form of Node 303 and Data 304 records, possibly after appropriate processing of the data.
Figure 8 shows a process model of entities associated with a data processing system such as the data processing system of Figure 1. Whilst the model of Figure 3 generally represents data of a data processing system, the model of Figure 8 shows relationships between data processing operations using the model of Figure 3. In particular, it can be seen that node 303 and meta node 306 of Figure 3 are shown in Figure 8.
Each node 303 has a relationship with a node status 801 that stores a status 802 for a node. The node status 801 allows statuses of nodes to be determined and also allows validity of a node to be modified based upon fields of the node status 801, for example indicating an elapsed time period since a node was set to a status and maximum duration fields defining a maximum time for which a status is valid. Where it is determined that a status is no longer valid an alternate path trigger may be executed to modify processing associated with a node in the data processing system. Each status 802 has a relationship with meta node 306 associated with node 303 such that the meta node 306 defines possible statuses for node 303. For example, an insurance claim request record associated with a meta-node may have possible statuses associated with the meta node of Requested, Pending for Evaluation, Approved, Completed Payment etc. When data records for insurance claims are created at run-time defined by nodes, those nodes can have an associated status selected from the possible statuses associated with the meta-node. Each time a status is changed the change is stored in node status 801. The record with the largest SequenceNumber is the current status for that node, representing the current status of the insurance claim.
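A minimal sketch of the Node Status behaviour described above is given below, assuming the status history is held as a list of rows carrying a sequence number; the current status is simply the row with the largest sequence number.

# Illustrative sketch only: deriving the current status of a node from its
# status history, as in the insurance claim example.
node_status = [
    {"node_gid": 42, "status": "Requested",              "sequence": 1},
    {"node_gid": 42, "status": "Pending for Evaluation", "sequence": 2},
    {"node_gid": 42, "status": "Approved",               "sequence": 3},
]

def current_status(rows: list, node_gid: int) -> str:
    rows_for_node = [r for r in rows if r["node_gid"] == node_gid]
    return max(rows_for_node, key=lambda r: r["sequence"])["status"]

assert current_status(node_status, 42) == "Approved"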
Each status is associated with a trigger 803. Each trigger 803 may be associated with an event such as user input, or an event associated with the data processing system. Each trigger 803 has a relationship with one or more business rules 804 which define rules associated with a trigger. When an event associated with a trigger 803 is determined to have occurred in the data processing system, each business rule 804 associated with the trigger is processed. Whenever a status associated with a node is modified the data processing system will process each business rule associated with the new/selected status. Each trigger 803 may additionally have a relationship with one or more transactions 803a that may define security and/or permissions associated with a trigger 803.
Each business rule 804 may have a relationship with one or more conditions 805 that each comprise a logical condition that is required to be satisfied for a business rule 804 to be executed. Conditions 805 may in turn be associated with cases 806 which provide further subdivision of conditions into logical conditions. For example each condition 805 may be based upon a plurality of cases 806 which are the smallest logically evaluable segments. A case may for example contain two operands and an operation to evaluate to contribute to the condition it belongs to. Cases 806 may have a sequence number field that may be used to determine an order of evaluation of cases. Each case 806 may for example include Var1_ID and Var2_ID fields which can be set as input parameters, which may for example be associated with a parameter of data 304 of the model of Figure 3 or may be fixed values. Each case 806 may be associated with an operator 807 defining a logical operator associated with the case 806.
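The following sketch illustrates, under stated assumptions, how a business rule built from conditions and cases might be evaluated: each case compares two operands with an operator, cases contribute to their condition, and conditions are combined with AND/OR in sequence order. The dictionary layout, the "combine" key and the treatment of a condition's cases as an AND group are assumptions made for the example rather than details fixed by the process model.

# Illustrative sketch only: evaluating a business rule from conditions and cases.
import operator

OPERATORS = {"=": operator.eq, "!=": operator.ne, ">": operator.gt, "<": operator.lt}

def eval_case(case: dict, data: dict) -> bool:
    # Var1 is resolved from the data record; Var2 may be a data reference or a fixed value.
    var1 = data[case["var1_id"]]
    var2 = data[case["var2_id"]] if case["var2_class"] == "Data Reference" else case["var2_id"]
    return OPERATORS[case["operator"]](var1, var2)

def eval_condition(condition: dict, data: dict) -> bool:
    # Assumption: all cases of a condition must hold, evaluated in sequence order.
    return all(eval_case(c, data) for c in sorted(condition["cases"], key=lambda c: c["sequence"]))

def eval_business_rule(rule: dict, data: dict) -> bool:
    result = None
    for cond in sorted(rule["conditions"], key=lambda c: c["sequence"]):
        value = eval_condition(cond, data)
        if result is None:
            result = value
        elif cond["combine"] == "AND":
            result = result and value
        else:                                   # "OR"
            result = result or value
    return bool(result)

rule = {"conditions": [
    {"sequence": 1, "combine": "AND",
     "cases": [{"sequence": 1, "var1_id": "claim_amount", "operator": ">",
                "var2_id": 1000, "var2_class": "Fixed Value"}]},
]}
assert eval_business_rule(rule, {"claim_amount": 2500})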
If a business rule 804 is determined to be satisfied, for example based upon the conditions 805, cases 806 and operators 807 associated with the business rule 804, then one or more operations may be performed based upon one or more macros 808 associated with the business rule 804. Each macro 808 has a relationship with a standard routine 809 and a relationship with one or more macro parameters 810.
Each standard routine 809 is associated with an operation that may be executed by the data processing system and defines rules associated with execution of the operation, and each macro parameter 810 defines rules associated with values for each macro 808. Each macro parameter 810 has a Variable ID field that may be used to define a relationship between the macro parameter and an instance of a data entity 304 of Figure 3 that is used in the execution of an operation associated with the macro. A variable class field allows the macro parameter 810 to specify whether a value associated with the macro parameter 810 is a reference to an instance of a data entity or a fixed literal. Where a variable class is a fixed value, the fixed value may be provided by a variable value field of the macro parameter entity 810. A standard routine parameter entity 811 allows properties of data to be defined for standard routines, which are used in each instance of a standard routine.
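As a hedged sketch of macro execution, the example below resolves each macro parameter either from a data reference or from a fixed value, in sequence order, and then invokes the standard routine named by its executable component. The SendSMS routine, the lookup table and the parameter layout are assumptions made for illustration only.

# Illustrative sketch only: running a macro as a configured instance of a
# standard routine with resolved parameters.
def send_sms(number: str, message: str) -> None:
    print(f"SMS to {number}: {message}")          # stand-in for a real gateway call

STANDARD_ROUTINES = {"SendSMS": send_sms}         # "Executable Component" lookup

def run_macro(macro: dict, data: dict) -> None:
    args = []
    for param in sorted(macro["parameters"], key=lambda p: p["sequence"]):
        if param["variable_class"] == "Data Reference":
            args.append(data[param["variable_id"]])   # resolve from the node's data
        else:                                         # "Fixed Value"
            args.append(param["variable_value"])
    STANDARD_ROUTINES[macro["standard_routine"]](*args)

macro = {"standard_routine": "SendSMS",
         "parameters": [
             {"sequence": 1, "variable_class": "Data Reference", "variable_id": "customer_phone"},
             {"sequence": 2, "variable_class": "Fixed Value", "variable_value": "Your claim was approved"},
         ]}
run_macro(macro, {"customer_phone": "+44100000000"})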
Each entity of the process model of Figure 8 may additionally include data auditing fields and fields associated with validity of entities of the process model in a corresponding manner to that described above with reference to Figure 3 to provide data auditing of the process model and to allow modifications of the model to be made and tracked as described above.
The process model of Figure 8 therefore allows processing performed by the data processing system to be carried out in accordance with specified rules defined within the process model.
The model may be used to provide sharing of best practices between data processing systems. For example, it will be appreciated from the above that the models of Figure 3 and Figure 8 define rules relating to how data is defined and how the data should be processed. The models may therefore be used to instantiate data in a different but related data processing system based upon a configured data model such as the data model of Figure 3 and processing of the data may be controlled based upon a configured process model such as the process model of Figure 8.
It will be appreciated that by providing control of validity of entities within the data and process models, particular functionality can be activated or deactivated in different systems in a straightforward way and functionality can also be added in a straightforward way by implementing new rules associated with data.
Figures 9a and 9b show addition of a field "Contact Details" to a data structure. Figure 9a shows a customer meta-node 306 of Figure 3 before addition of a field and Figure 9b shows a corresponding customer meta-node after addition of the field. In order to add a field to a meta-node, a new instance of the meta data structure that has a relationship with the meta-node is created in the model. The new instance is identical to the previous instance of the meta data structure, but specifies that data associated with the meta node has the additional required field, and has an effective date that specifies when the new field becomes part of the data. When the data processing system processes data the data processing system checks the validity of the corresponding meta-data entity in the model and determines from the model that after 20/1/2011 the data includes the new field "Contact Details". Processing is then performed based upon the rules associated with the new instance of the customer meta-node defined in the model. All the meta-data related to the previous meta-node will exist in the model and in the database as it was. Processes associated with the meta-data may additionally be updated such that the new data fields are processed, for example by modifying associated processes in the process model.
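The release-management behaviour described above can be sketched as follows, assuming meta-data instances are held as simple records with "valid from"/"valid to" dates; the helper names are assumptions, while the date 20/1/2011 follows the Contact Details example.

# Illustrative sketch only: adding the "Contact Details" field by creating a
# new meta-data instance with its own effective date; older records are untouched.
from datetime import date

customer_meta_data = [
    {"field": "Name",    "valid_from": date(2009, 1, 1), "valid_to": None},
    {"field": "Address", "valid_from": date(2009, 1, 1), "valid_to": None},
]

def add_field(meta_data: list, field_name: str, effective: date) -> None:
    meta_data.append({"field": field_name, "valid_from": effective, "valid_to": None})

def fields_valid_on(meta_data: list, at: date) -> list:
    return [m["field"] for m in meta_data
            if m["valid_from"] <= at and (m["valid_to"] is None or at <= m["valid_to"])]

add_field(customer_meta_data, "Contact Details", date(2011, 1, 20))
assert "Contact Details" not in fields_valid_on(customer_meta_data, date(2010, 6, 1))
assert "Contact Details" in fields_valid_on(customer_meta_data, date(2012, 6, 1))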
It will be understood from the above that objects of the data processing system are defined using "Meta Node", "Meta Data", "Meta Qualifier" etc. Process elements such as "Task", "Business Rule", "Condition", "Case", "Macro" etc. define the processes of the data processing system. In contrast "Node" and "Data" entities are run-time elements which will be created and used when the actual data records are created based on the defined meta-elements.
Figure 10 shows deletion of an address field associated with a customer node. In order to effect deletion of a field from data a new instance of the meta data structure associated with the data is generated with the desired modification and with dates specifying when the modification becomes effective. The previous instance of the meta data structure is also updated to indicate when the data structure ceases to be effective. After the effective date the data processing system is able to determine from the model the modified fields of the data. Editing of a meta data structure may be carried out by creating a new entity in the model and marking the previous entity as deleted. The data model may be used to provide data consistency within a run-time data structure. In particular, the data model specifies properties of data that are required in the run-time data structure and data can therefore be audited based upon the rules specified in the data model. For example, each instance of a table of a database may be checked against corresponding entities of the data model and the associated rules defined in the data model. Any inconsistencies that are identified can be corrected based upon the rules specified in the data model, for example using default values that may be specified in the data model. The meta entities of the model allow changes to be made to the definitions of the data and any changes may be checked against existing data. The meta entities may be used to modify existing data where inconsistencies are identified to ensure that the data processing system may continue to process data in the system.
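The consistency auditing described above might, purely as a hedged illustration and assuming a simplified in-memory representation (the FieldRule name and its fields are hypothetical, not taken from the specification), be sketched in Python as follows:

from typing import Any, Dict, List, Optional

class FieldRule:
    """Hypothetical rule description derived from a meta-data entity."""
    def __init__(self, name: str, required: bool, default: Optional[Any] = None):
        self.name = name
        self.required = required
        self.default = default

def audit_and_correct(record: Dict[str, Any], rules: List[FieldRule]) -> List[str]:
    """Check one run-time record against its data-model rules.

    Missing required fields are reported and, where the model supplies a
    default value, corrected in place."""
    issues = []
    for rule in rules:
        if rule.required and record.get(rule.name) in (None, ""):
            issues.append(f"missing value for '{rule.name}'")
            if rule.default is not None:
                record[rule.name] = rule.default  # correct using the model's default
    return issues

rules = [FieldRule("Customer Name", required=True),
         FieldRule("Country", required=True, default="Unknown")]
record = {"Customer Name": "ACME", "Country": ""}
print(audit_and_correct(record, rules), record)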
All changes may be made, based upon the models, to instances of the data that are not currently being processed by the data processing system, by specifying a future validity date for the corresponding entities in the model. Modifications can therefore be made without affecting the running of the data processing system, and the changes can be caused to be automatically implemented at the appropriate time based upon the validity criterion discussed above.
The use of models as described above supports various additional features, which will now be described.
The models described above allow support for multilingualism to be provided by the addition of two additional tables to the data model, as illustrated in Figure 11, in which additional language and translations entities are provided. As shown in Figure 11, the language entity provides Language ID and Language Name fields and the translations entity provides Language ID, Resource Type (0=System Literals, 1=Instance, 2=Meta Structure, 3=Meta Node, 4=Node, 5=Meta Qualifier, 6=Qualifier, 7=Meta Data, 8=Data, 9=Transaction, 10=Trigger, 11=Business Rule, 12=Condition, 13=Case), Resource ID, Translation1 and Translation2 fields.
The language and translations entities define relationships between data and languages, and allow data to be provided to a user in a selected language automatically based upon the model. In particular, the translations entity stores identifiers and types of data, together with translations for the data that may be retrieved and provided as output to a user. A user's preferred language may for example be stored in a user entity described below. The meta-data structure may store data indicating whether a particular data item may be translated, and the languages that may be used.
Auditing may be performed on the data by the data processing system because all instances of the model are stored and may be accessed. As all the data changes are stored in copies of nodes and data, they also will be available to track the history of data changes. A report can be produced to list all the changes performed on nodes and data together with information such as Created By, Created Date, Updated By, Last Updated Date etc. which are default fields for any entity of the model including meta elements such as meta-node, meta-data etc. and process elements such as task, business rule, macro etc.
A further accessibility model may be provided that provides data associated with users of the system and that defines permissions for users to access the data of the data model and processes of the process model. An example accessibility model is shown in Figure 12.
A user entity has a relationship with one or more User Profile entities such that a single user may have more than one associated user profile. Each user profile is associated with at least one role. The relationships between users and profiles allow profiles to be grouped according to permissions available to the profiles based upon roles associated with the profiles. The permissions may be used to determine whether a user associated with the data processing system is permitted to carry out a particular transaction, which is associated with a role.
Transactions of the accessibility model correspond to transactions 803a of the process model of Figure 8 such that the accessibility model in combination with the process model determines what operations are permitted to be carried out on the data by a particular user of the data processing system.
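Purely as an illustrative sketch, and not as part of the specification, the permission resolution from user to profile to role to transaction might be expressed in Python as follows (all names and example data are hypothetical):

from typing import Dict, Set

# Hypothetical in-memory representation of the accessibility relationships.
user_profiles: Dict[str, Set[str]] = {"alice": {"helpdesk", "manager"}}
profile_roles: Dict[str, Set[str]] = {"helpdesk": {"incident_handler"},
                                      "manager": {"approver"}}
role_transactions: Dict[str, Set[str]] = {"incident_handler": {"CreateIncident"},
                                          "approver": {"ApproveChange"}}

def is_permitted(user: str, transaction: str) -> bool:
    """A user may execute a transaction if any role of any of the user's
    profiles lists that transaction."""
    for profile in user_profiles.get(user, set()):
        for role in profile_roles.get(profile, set()):
            if transaction in role_transactions.get(role, set()):
                return True
    return False

print(is_permitted("alice", "ApproveChange"))   # True
print(is_permitted("alice", "DeleteCustomer"))  # False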
A set of Application Programming Interfaces (APIs), for example a REST (Representational State Transfer) based API set, may be defined to provide the functionality of a data processing system based upon a run-time data structure. Such an API can be provided to enable communication between a run-time data structure and processes based upon the data and process models, and external services. Such an API may provide functionality for obtaining data from the run-time data structure and for writing data to the run-time data structure, ensuring that the rules defined by the models described above are obeyed. Using an API in this way allows integration of data processing systems, including external systems and legacy systems, into a common platform that is controlled using the data and process models.
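The following Python sketch is illustrative only and does not represent the actual API; it assumes a hypothetical facade (ModelDrivenStore) standing in for a REST layer over the run-time data, and shows how writes might be validated against rules taken from the data model before being committed:

from typing import Any, Dict

class ModelDrivenStore:
    """Minimal facade standing in for a REST layer over the run-time data.

    Writes are validated against rules taken from the data model before
    being committed, so external and legacy systems cannot bypass them."""

    def __init__(self, rules: Dict[str, Dict[str, Any]]):
        self.rules = rules          # e.g. {"Customer": {"required": ["Name"]}}
        self.records: Dict[str, Dict[str, Any]] = {}

    def get(self, key: str) -> Dict[str, Any]:
        return self.records[key]

    def put(self, key: str, entity_type: str, payload: Dict[str, Any]) -> None:
        required = self.rules.get(entity_type, {}).get("required", [])
        missing = [f for f in required if not payload.get(f)]
        if missing:
            raise ValueError(f"model rules violated, missing: {missing}")
        self.records[key] = payload

store = ModelDrivenStore({"Customer": {"required": ["Name"]}})
store.put("cust/1", "Customer", {"Name": "ACME"})
print(store.get("cust/1"))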
User interface screens may additionally be provided to allow access to the run-time data. The user interfaces may for example be provided by way of forms (using data structures) such as Incident, Problem Form, Configuration, Issue Tracking, Employee Time Sheet Entry. Display of data within forms may be controlled based upon the data and process models as described above. The process model may be used to control workflows within the data such as controlling forms that are displayed in response to triggers as described above, with display being controlled within the framework provided by the data and process models.
Referring to Figure 13, there is shown a schematic illustration of the generation and use of a data processing system using the models described above.
The system includes three general functional components with some overlap between the functional components as illustrated. A design function 1301 provides an environment for design of the models for use in a specific data processing system. For example the design function 1301 provides functionality allowing the design of instances of the models described above that have specific data and rules for a particular data processing system.
Design function 1301 may be implemented using appropriate software/hardware/tools to create and maintain the meta-model (meta level elements) to create business structures and processes as well as configuration of the other elements such as ESB and APIs etc. Data store functionality 1302 stores data that is to be processed together with the models described above. As described above data stored in the database has a relationship with the entities of the models and the data is processed based upon the model and the relationship between the data and the entities of the model. For example, each instance of a node and data structure entity of the model may have a corresponding data item in the data to be processed and the relationship between the node and data structure and the data is known.
The data store functionality may be used to store all the information required for the data processing system to function, in the form of a relational database as specified in Figures 3, 8, 11 and 12. Any suitable database technology can be used to generate the data store using the meta-model.
A runtime environment 1303 provides an environment in which data of the data store functionality 1302 may be processed based upon the models and additionally provides functionality to allow external systems 1304 to participate in the data processing. The runtime environment 1303 may for example incorporate an enterprise service bus to provide communication between systems that participate in the data processing.
The runtime environment 1303 executes processes and business rules, provides user interfaces and provides connectivity to other external systems. This system can be created using appropriate software/hardware/tools such as programmed modules, ESBs, Web Servers, Application Servers etc.
The data store functionality includes a data access layer 1305 that provides an interface between the stored data and the design functionality and runtime environment. The data access layer may for example provide one or more APIs for communication between the stored data and data processing systems. The data to be processed and data defining the models may be stored for example in one or more databases 1306.
The models described above may therefore provide a consistent framework for integrating data processing across multiple systems, which systems may be purpose designed or may be existing systems that interface with the runtime environment 1303 to provide functionality to the data processing system. By basing data processing on models that define rules for how the data may be processed, the systems are able to be integrated in a straightforward way.
Additionally, because the models allow multiple instances of data structures to be created and to exist simultaneously, updating of the data processing system may be provided in a straightforward and consistent way without the need to stop processing within the runtime environment whilst updates are programmed. In particular, by providing multiple instances of data structures the data structures can be automatically integrated based upon data indicating validity of the data structures to provide runtime modification and update of the data processing system that is implemented consistently across all systems of the data processing system. The following provides further information relating to some embodiments of the invention. This description should be considered entirely separately to that set out above, as well as in combination therewith.
METHOD AND SYSTEM FOR INTEGRATING COMMUNICATION BETWEEN BUSINESS PARTICIPANTS AND SYSTEMS
ABSTRACT
The present invention relates to a meta-model and a methodology to define and configure a dynamic communication platform using real world business data structures, processes, their interactions and integration points with the capability to change without having additional programming effort, with the support for parallel running of different versions of structures and processes simultaneously to make change transitions smooth, while providing its own consistency auditing, transaction logging, multilingualism, and release management using a single, all-in-one meta-model. This meta-model has the capability to hold business content in the form of structures, processes and configurations, allowing community members to share best practices among them.
DESCRIPTION
TECHNICAL FIELD/FIELD OF INVENTION
The present invention relates generally to business processes and communication, and more particularly to an information technology based solution to create a dynamic communication platform with dynamic data structures, dynamic processes, consistency auditing, continuous release management, multilingualism and system integration support, all working together in a complementary manner in the form of a Business Process Meta-Model providing a way to implement concrete business processes in an agile manner.
BACKGROUND OF THE INVENTION
In the present day, organizations face major difficulties in managing their day-to-day IT business requirements due to the inability to manage the integrations between service providers and different IT systems, both in-house and with external organizations, due inter alia to:
The involvement of Multiple Stakeholders and Multiple Users.
Bundled services provided today include a hierarchical number of elements, which makes the service itself very complex. Further, the complexity of the service delivery is ever increasing, the number of interaction channels between stakeholders in a service delivery environment being a major contributor to this complexity. Even to deliver a single service there is an involvement of multiple stakeholders such as providers, sub-providers and multiple users of each provider and sub-provider.
The frequency of change in the requirements, and hence of testing of the services across the delivery.
In order to satisfy demand in the competitive market, competitive services are released to the market very frequently. Management of the bundled services to be launched, and the requirement for them to be released quickly to the market, is a rigorous task in the present service delivery challenge. Even deployed services require rapid and frequent changes, making service management difficult. Within the service delivery process, changes, testing and implementation/changeover are major challenges in a seamless transition of service delivery.
Multi-language support
As a result of globalization, intercommunication between stakeholders spread across different regions and countries requires communication using their native languages.
Disparate Coding languages/standards
Unavailability of common coding structures and glossaries between participants in the service delivery community creates a disconnection in the community.
Multiple Delivery Schedules.
Multiple schedules of the stakeholders are also part of the service management challenge.
Therefore organizations need to optimize the quality of their IT requirements by creating the right environment, tools and processes that help them to achieve their strategic objectives quickly. In the last twenty years enterprises have been trying to resolve the Service Delivery Challenge through IT tools. This involved centrally defined processes implemented using a central engine. For this purpose bespoke legacy applications were developed with a high investment to provide the integrated solution. Every single stakeholder in the community should adhere to the centrally defined processes. The leader of the community shares its solution by giving access to its legacy IT infrastructure to the other members.
• Main benefits:
Harmonized processes, coding systems and business rules to run the Community are de facto imposed on all members.
• Main constraints:
Other partners have to run their own solution independently, meaning double data entry and potential integrity issues, and duplicated culture and expertise.
Community platform is perceived as a constraint, with major business benefits for its owner.
• Comments:
Direct data interchanges between IT systems remain the exception, managed through dedicated and unstable developments that apply to big data volumes only.
• Drawback of the legacy solutions are:
The deployment cycle is very long. These solutions cannot be reconfigured/modified quickly and easily.
Initial development required specific skills and high investment.
Unable to respond to dynamic changes, and the transition cycle is very long.
Enterprise Service Bus (ESB); a Hybrid Architecture Solution.
The said problems were attempted to be resolved at the beginning of the 2000's by solutions in the form of an Enterprise Service Bus (ESB), which enabled better communication between the stakeholders. The ESB was able to integrate the communication of the independent components of heterogeneous service management while keeping a decoupled architecture, where a common messaging model was used in the ESB based on a hybrid architecture solution. This enabled quick integration and reduced bespoke application development. The ESB was able to moderately reduce cost and to improve quality, efficiency and governance compared to legacy solutions.
Nevertheless, the ESB configuration required skilled resources, and each time a configuration changed, the stakeholders in the community were not informed through the ESB solution. Further, the agility of the ESB solution is very low, with service transition and deployment requiring considerable time to be expended. If a new version was introduced, it resulted in service interruption. Therefore, in a dynamic service delivery environment the ESB did not resolve the issues in service delivery.
In this scenario, Community members are provided with a tool to manage automated data interchanges and avoid double entry work.
• Main benefits:
Impact on Member's legacy IT system is limited. Running costs can be shared according to usage.
• Main constraints:
Configuration and maintenance of the tool is complex, especially when data structure transformation and content translation are concerned. It generates loss in functional content.
Effectiveness of the implementation depends greatly on the maturity of the systems of individual partners. Low maturity at the sender impacts all receivers.
Limited control on processes restricts continual improvement.
• Comments:
Definition of standard data structures/messages is a workaround to limit the complexity of ESB/ETL configuration. It however transfers the burden to the data senders/receivers.
The following communication related problems can be identified in the present Service Delivery domain:
a) Difficulties in handling the complexity of involvement of multiple parties/participants in the service delivery process (Partners, Providers, Sub-Providers, Principals etc.).
b) Complexity of delivered services (services are compositions of multiple sub-services and bundled services containing service components of varied types).
c) Rapidly changing delivery models (services are rapidly changed depending on customer requirements, demand patterns, innovation, strategic requirements, government involvement etc.).
d) Complexity of activities involved in the delivery process. (Tracking of issues, Problems, Managing Releases, Managing Configurations etc.).
e) Streamlining of interaction of System Component and People Component. (Lack of defined processes).
f) Inability to maintain consistent communication between parties. (Lack of governance and management of communication, controlled communication).
g) Lack of adherence to a common set of standards, common terms, glossaries and other communication protocols.
h) Difficulties in integrating external systems and partner IT solutions to participate in the communication exercise.
i) Service continuity, delays, inconsistency, unsuccessful implementation and other issues of current IT Solutions and Tools.
j) Difficulties in representing and handling real world business data structures and process models with existing IT Solutions and Tools with required consistency, efficiency, effectiveness, time and cost.
A SUMMARY OF THE PRESENT INVENTION
The present invention; the Business Process Meta-Model, takes a different approach to resolving the Service Delivery Challenge where, by means of an agile platform - the Community Integration Framework - it facilitates the management of the above complexity through the deployment of processes, data models and controlled exchanges between organizations. Compared to the aforesaid Legacy and Hybrid ESB solutions, the present invention creates a successful communication model for the community, successfully integrating the systems and the people in the community.
The present invention would enable organizations to organize and manage their IT interactions and communication within a community of multiple and different partners as part of their business process. The present invention provides governance to control the interactions within the community, thus enabling the organization to act as the central exchange for the information of all the players within the community. The present invention consists of a configurable toolkit, which reduces the typical application implementation time by introducing ready-made business content of domain specific global best practices, processes, and patterns for common scenarios. Furthermore, the present invention can be quickly and easily fine-tuned to an organization's exact needs, which would otherwise take a much longer time to build from scratch.
Because the present invention has the capability of executing multiple versions on the same platform, seamless service delivery and service transitions are also improved. This enables configuration and structure changes to the communication model to be made quickly in real time, which improves awareness in the community. This is achieved through the central communication model, coding standards and glossaries that are shared in the community, enabling participants of the community to understand business needs in a common language.
The description of the present Invention; the Business Process Meta-Model, is found herewith from paragraph [500] onwards.
DETAILED DESCRIPTION
[500] The invention is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar items.
[500-1] Other components and configurations may be used without departing from the scope and the spirit of the invention as they were explained and discovered in the methodology/procedure.
[500-2] When specific implementations are discussed, it should be understood that this is done for illustrative purposes and should not limit the invention to the limitations of the illustrations themselves.
[500-3] Various technologies/tools can be used to achieve the same goals by addressing the issues listed in the problem domain, while still using the same methodology/procedure discovered in the present invention. In all such cases it should be identified that the methodology/procedure discovered in the invention is used in such arrangements to resolve the mentioned problems in the problem domain.
[500-4] The artifacts and elements used in combination in the invention provide solutions for the mentioned problem domain. The objective of their usage in such an arrangement could vary depending on the platform/tools/technologies used to create the solution. In all such cases, it should be identified that they belong to the discovered invention.
[501] In the following description, numerous specific details are set forth to provide a thorough description of the invention. However the invention may be practiced without these specific details. In other instances, well-known features have not been described in detail, so as not to obscure the invention.
[502] Although a diagram may depict components as logically separate, such depiction is merely for illustrative purposes. The components portrayed can be combined or divided into separate software, firmware and/or hardware components.
[503] In representing enterprise business processes, data and configuration of communication models, it is required that various aspects of these modeling requirements be addressed. The Business Process Meta-Model (hereafter referred to as "Meta-Model") has basic constructs to facilitate all these requirements.
THE CORE META-MODEL
[504] To provide facilities to have a comprehensive Data Model which is dynamic and extensible, the model should be fully driven by a meta-data model. Once the meta-data model is designed, the actual data entry may be driven by that Meta-Model. (Entered data complies with meta-data rules). [Refer Figure 3]
[505] The present Invention has a top level isolation, making it possible to create multiple instances [refer Figure 14] of models. Each model will be self-contained and completely isolated from others. This allows creation of multiple instances within the same organization or domain for varied purposes. (An example would be having Design, Development, Testing/Sandbox and Production instances for said purposes). Data Structures, Process Definitions, Users, Roles, Profiles and all related information will be stored under an instance to provide this isolation.
[506] There may be a set of root level / top level meta level structures to identify logical separation of behavior and purpose of the data structures created using the Meta-Model. (Examples: User, Partner, Location, Language and Title, Incident Record, Change Request Record structures). These logical structures provide the scope and containment functionality of each structure which could be created for a specific purpose. Figure 23 specifies the top level meta-structure arrangement in an instance.
[507] Typical business structures are multi-level hierarchical, and these hierarchical structures can be represented by nodes in a parent-child relationship arrangement. (Example: an employee having skills and hobbies can be represented by skill and hobby child nodes, making the employee node a parent). Using this methodology, data structures of any depth and width can be represented. The information about these nodes is stored in a meta-node, which describes the actual node. In this arrangement, the meta-node contains the information about the node. Figure 16 shows the capability to define multiple child nodes in hierarchical structures using meta-nodes.
[507-1] All meta level elements share a common set of fields providing a common set of functionality, namely "ID", "Display Legend" and "Display Assistance".
[507-2] The ID field is used as an auto generated unique key (primary key) for the table storing the elements. Depending on the database model used, the ID field could be an auto number field or a manually generated unique ID, which acts as the primary key for the table.
[507-3] Display Legend specifies the wording / text which could be used to show the name of the element when rendering data entry screens etc. Display Assistance is an additional field, which can store more 'help information' regarding an entity. Some elements such as meta-structure will only have a display legend while some structures such as meta-data will have both. In the case of meta-data, the display legend can be used as the caption for the text entry boxes and the display assistance can be used as the additional 'help text' to describe more details on how to enter the data, rules etc. The latter would be in a detailed and descriptive form, in a data entry form arrangement.
[507-4] The cardinality (the maximum and minimum number of objects which can exist) of a given element (meta-node) in a particular position in the hierarchy is recorded in the CardMin and CardMax fields, which belong to the meta-node object. When nodes are created for a given meta-node, it is possible to check and enforce these cardinality rules, so that the nodes created will be in compliance.
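As an illustration only (assuming a simplified in-memory list of child nodes; the function and parameter names are not part of the specification), the cardinality enforcement might be sketched in Python as:

from typing import List

def check_cardinality(existing_children: List[str], card_min: int,
                      card_max: int, adding: int = 0) -> bool:
    """Return True if the number of child nodes (after adding `adding`
    more) stays within the CardMin/CardMax limits of the meta-node."""
    count = len(existing_children) + adding
    return card_min <= count <= card_max

# A meta-node allowing between 1 and 3 "Skill" child nodes:
print(check_cardinality(["Java", "SQL"], card_min=1, card_max=3, adding=1))       # True
print(check_cardinality(["Java", "SQL", "C"], card_min=1, card_max=3, adding=1))  # False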
[507-5] The display sequence order of a given element in a multiple-element arrangement is specified using a field named "SequenceNb". This field can contain a number or a textual representation of how the order of the element ought to be presented in multiple element scenarios. This is relevant to meta-node and meta-data. When specified in meta-nodes, the actual data nodes will be arranged in the specified order. When specified at meta-data level, the data columns / fields shown in the record will be ordered using this order. The use of this field is for rendering / presentation purposes only.
[508] As further detailed in the Identity and Access Management / Security section, meta-nodes act as sub-structural components (Example: corresponding to sub-sections of a hard-copy form) which separate functional aspects such as visibility, rendering, and reading and writing permissions etc. Rendering engines can read the permission Read/Write/Hidden and render the details appropriately. Figure 6 shows the arrangement of sub-structural components of a typical form structure in which the sections' visibility and accessibility can be controlled at an individual level.
[509] Representation of data elements for a given entity is done using meta-data. Entity objects' properties/leaf level values/attributes are defined using the meta-data, which belong to meta-nodes (e.g. for a user object there could be properties such as First Name, Last Name, Address, Mobile etc.). Figure 17 shows the meta-data representation of a sample meta-node.
[509-1] Specially selected meta-data can be designated as a "Display Name" field. The data element related to this field is used to show/list the objects of this meta-node type when they are listed in a list box or a combo box. For example, if there is an Employee object created using an Employee meta-node and there are Employee No, Employee Name and Salary meta-data, one field such as "Employee Name" can be selected/marked as the "Display Name" field. Whenever employees need to be listed, the Employee Name can be used to list them.
[509-2] Type specification of data stored in records can be specified at the meta-data level using the specific fields "DataType", "Length", "FormatSpecification" and "ControlType". The DataType field can contain the allowed data type for this particular data element. Possible values can be, but are not limited to, Numeric, Text, Character, Date/Time, Long Integer and Double. The FormatSpecification field could contain specific formatting rules such as number of decimal places, allowed set of characters and other similar rules depending on the actual implementation required. The ControlType field can be used to specify the type of user interface control to be used when rendering data entry forms and user interaction components.
[509-3] Rules can be defined to specify whether null or empty values are allowed in the data entry. This is also stored at the meta-data level. The "ValueNullYN" and "ValueEmptyYN" fields are respectively used for this purpose.
[509-4] Default values can be specified at the meta-data level as well. These defaults will be used in the data when it is initially created (before any explicit values are set).
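A hedged illustration of how the DataType, ValueNullYN, ValueEmptyYN and default value rules of [509-2] to [509-4] might be applied to a single entered value is sketched below in Python; the MetaDataRule name and the simplified type handling are assumptions, not part of the specification:

from dataclasses import dataclass
from typing import Any, Optional

@dataclass
class MetaDataRule:
    # Illustrative subset of the meta-data fields named in [509-2] to [509-4].
    data_type: str            # e.g. "Numeric", "Text", "Date/Time"
    value_null_yn: bool       # is null allowed?
    value_empty_yn: bool      # is an empty string allowed?
    default_value: Optional[Any] = None

def validate(value: Any, rule: MetaDataRule) -> Any:
    """Apply the meta-data level rules to a single entered value."""
    if value is None:
        if rule.default_value is not None:
            return rule.default_value      # initial creation uses the default
        if not rule.value_null_yn:
            raise ValueError("null values are not allowed for this field")
        return None
    if value == "" and not rule.value_empty_yn:
        raise ValueError("empty values are not allowed for this field")
    if rule.data_type == "Numeric":
        float(value)                       # raises ValueError if not numeric
    return value

rule = MetaDataRule("Numeric", value_null_yn=False, value_empty_yn=False, default_value=0)
print(validate(None, rule))    # 0 (default applied)
print(validate("12.5", rule))  # accepted as numeric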
[510] Once a business structure is created according to the desired hierarchy using Instance, Meta-Structure, Meta-Node and Meta-Data, it is possible to render a data input screen based on the structure, and once the data is entered it can be received in the same form to represent each instance of the structure using nodes and data objects. The exact structure of meta-nodes (including child meta-nodes) and meta-data will be followed by the node and data structure arrangement when saving to database tables. Figure 18 shows the meta-data representation of a sample meta-node.
[511] It is possible to extend the behavior of a meta-node by defining types to be associated with the meta-node. This is achieved by using qualifiers. Qualifiers are a list of qualities from which one can be assigned to a node to specify a specialty or type. Qualifiers are defined by a meta-qualifier, attached to meta-nodes and assigned to nodes when nodes are created. Figure 19 shows the diagram of meta-qualifier and qualifiers connected to a specific meta-node / node arrangement. Figure 20 shows the usage of a qualifier to qualify a node by providing specialization.
[512] A single meta-node can be marked such that it can have its own hierarchy by having child nodes of the same type. Figure 21 represents the graphical notation for such an arrangement. This allows users to create hierarchical data elements of the same type. For example, a location can contain locations as child nodes in a real-world data modeling requirement. There could be any number of levels (depth) of locations in a parent-child type of arrangement. This can be named a recursively repeating meta-node. A special data field called "HierarchyYN" is used to record whether or not a node can be recursively repeated in a hierarchy.
[513] Further to the arrangement of the recursively repeating meta-node, additional qualifications for levels can be added using qualifiers for each level. In this arrangement, even though the same meta-node is repeatedly used, different levels will represent different types of elements. In the same example as in [512], location can be specialized by Country, City, Region and Building qualifiers, which will give more specific meaning to each level of the meta-nodes. To further enhance this functionality, qualifiers can be arranged in hierarchical structures so that they govern the hierarchy of the recursively repeating meta-node by giving a set of hierarchy rules. This is shown in Figure 7.
[514] However, it is not mandatory to have a meta-qualifier for a meta-node or a qualifier for a node. Meta-nodes and nodes can stand independently without being qualified by any.
[515] Functional keys can be defined for both meta level items and data level items. The purpose is to uniquely identify a meta element or data element which is stored in the Meta-Model.
[516] A special FunctionalID field is used for all meta level elements (Instance, Meta-Structure, Meta-Node, Meta-Qualifier and Qualifier) to uniquely identify an element which can be used in all references to those elements.
[517] A data field or multiple data fields (meta-data) can be marked using a special attribute called "Contribute To Key" to denote that the field will contribute to form a composite key for the data stored under a given meta-node. The marking is done at meta-data level. When data is stored representing a given meta-node, the functional key is calculated or computed according to the marked data elements which contribute to the key. This value can be stored directly in the "Node" object for fast query purposes. The functional key scenarios for data elements are demonstrated in Figure 22.
[518] When there are no fields marked as "Contribute To Key", then the parent functional ID and the default display value of the field can be used as the unique functional key, and the qualifier can be used in combination to narrow searches. In some cases multiple objects can exist when no unique key constraint is specified. The functional key scenarios for data elements are demonstrated in Figure 22.
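By way of illustration only, the functional key computation of [517] and [518] might be sketched in Python as follows; the field names, the separator character and the fallback combination are illustrative assumptions rather than the specification's actual scheme:

from typing import Dict, List

def functional_key(node_data: Dict[str, str], key_fields: List[str],
                   parent_functional_id: str = "", qualifier: str = "") -> str:
    """Compute a composite functional key from the meta-data fields marked
    "Contribute To Key"; fall back to parent id + display value + qualifier
    when no field is marked (as described in [518])."""
    if key_fields:
        return "/".join(node_data[f] for f in key_fields)
    display = node_data.get("DisplayName", "")
    return "/".join(p for p in (parent_functional_id, display, qualifier) if p)

employee = {"Employee No": "E042", "Employee Name": "J. Smith",
            "DisplayName": "J. Smith"}
print(functional_key(employee, ["Employee No"]))             # "E042"
print(functional_key(employee, [], "ACME/HR", "Permanent"))  # fallback key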
[519] The information about changes to records can be stored in the records themselves, forming audit information. "Date Created", "Created By", "Date Last Updated", "Last Updated By" and "Date Last Used" fields will hold the change information on all the meta elements as well as data elements.
[520] History of changes can be tracked by using the model's features themselves. All elements have a common set of attributes to support this function. "Valid From" and "Valid To" fields are used to keep the validity period of elements. Each time an element is changed, the previous element will be invalidated by setting the validity period, and a new element is created with a new validity period which will take effect from the modification date/time. This allows two functional features. One is to have multiple copies of the same structure and keep the history. This concept is true for all the elements in the Meta-Model, including Instance, Meta-Structure, Meta-Node, Meta-Qualifier, Qualifier, Meta-Data, Node and Data. The other is that tracking the history of both meta level design changes and data level changes is possible using this design concept.
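An illustrative Python sketch of the versioning mechanism of [520], in which the previous element is invalidated and a successor is created with a new validity period while both copies are kept for history, is given below; the Element name and its fields are simplified assumptions, not the specification's schema:

from dataclasses import dataclass, replace
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class Element:
    # Common validity fields shared by all Meta-Model elements ([520]).
    name: str
    payload: dict
    valid_from: datetime
    valid_to: Optional[datetime] = None   # None = currently open-ended

def supersede(current: Element, new_payload: dict,
              effective: datetime) -> Tuple[Element, Element]:
    """Invalidate the current version at `effective` and create the new
    version starting from that moment; both copies are kept for history."""
    closed = replace(current, valid_to=effective)
    successor = Element(current.name, new_payload, valid_from=effective)
    return closed, successor

old = Element("Customer meta-node", {"fields": ["Name", "Address"]},
              datetime(2009, 1, 1))
old, new = supersede(old, {"fields": ["Name", "Address", "Contact Details"]},
                     datetime(2011, 1, 20))
print(old.valid_to, new.valid_from)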
[521] A mechanism can be designed to refer to various nodes from other nodes' data elements to store relationships (references). References can be from two basic sources:
a) Referring a node from any entity.
b) Referring a node from a data list (list of values)
Even though the purposes look different, the implementation is the same for both, except that the latter requires creating a list of values (or list of items) and providing for the user to select an item to be referred to.
Referencing other nodes is achieved at data level by defining a link (which is done by providing the key of the referred item) to the item referred to. This is achieved by allowing the Meta-Model to have a key to the other node in the data record. The possible nodes which can be referred to by a data element are specified at the meta-data level. In the meta-data it is possible to define a referred meta-node denoting which type of nodes can be referred to. The reference mechanism is illustrated in Figure 4 and Figure 5.
PROCESS MODEL
[600] A generalized process model can be defined using the same Meta-Model concept. Status, Transaction, Trigger, Business Rule, Condition, Case, Operator, Standard Routine, Standard Routine Parameter, Macro and Macro Parameter objects can be used to define processes and workflows.
[601] To provide process functionality, it is required to define process artifacts. Since the Meta-Model provides a way to design custom data structures which can be used to render graphical user interaction modules to accept input content / data in the form of form based input screens, it is possible to use meta-node based hierarchical structures in process and workflow definitions. These act as "form objects" which are used to record data.
[602] Forms being the entry point for process workflows, it is possible to represent an entry point to a process by creating a new "form object" (creating a new form using the meta structure for the form) and entering data to it.
[603] A "form object" (which is a node hierarchy with data associated to each node) can associate with a set of all possible "status" values for that particular form. (Status values are associated with a specific meta-node which will be available to assign to a node.) The very first/initial status can be stored in the meta-node itself (start-up status of the form in the workflow sequence). Figure 23 shows the relationship of "Status" with "Meta-Node".
[604] The Node Status table contains the various status values which are assigned to a given node at various times in the workflow process. The sequence of statuses with which the node has been associated is marked with sequence numbers.
[605] Transactions being the entry points of the processes and workflows, a transaction can be configured to set up a process/workflow. The transaction entity contains the information about the transaction, and can be connected to security / accessibility to control who can access that particular transaction. Transactions will directly invoke the trigger configured for them.
[607] According to the arrangement in Figure 8, triggers are of several types (based on button press, ESB event, business event etc.). Once a trigger is invoked by the appropriate invoker, the trigger will check the configuration set up for its execution. A single trigger can have multiple business rules to be evaluated, and based on the evaluation there can be macros which are configured to execute. A trigger will set the appropriate status of the node being addressed when executed. Standard routines can be written to generally invoke events by creating and calling a trigger even from within business processes, making it possible to install dynamic custom event handlers.
[607-1] Even though the Meta-Model provides provision for configuring various types of triggers of varied nature, it is the implementer's decision to couple triggers to actual system events (of the system being developed). The implementer can set up triggers for form events such as On Load, Before Save, After Save, On Business Process Start, After Business Process Start, On Business Process End, On ESB Message Receive etc. It is the responsibility of the software solution designed using the concept to call the appropriate triggers.
[608] A business rule is a composition of multiple conditions which comprise multiple cases. A business rule can invoke one or more macros based on the evaluation status of the set of conditions which the rule comprises.
[609] A condition contains a single logical statement (of multiple cases) which needs to be evaluated to satisfy the business rule it belongs to. Conditions can be grouped and combined using logical "AND" and "OR" operations, and they have a sequence number defining the order in which evaluation will take place.
[610] A condition contains multiple "cases", which are the smallest logically evaluable segments. A case contains two operands and an operation to evaluate, contributing to the condition it belongs to. Cases also contain a sequence number to determine the order of evaluation.
[610-1] A case has a Var1_ID parameter which can be set as the first input parameter. In typical cases this could be the id of data (a concrete instance of meta-data). Var2_ID could also be an id of reference data or a fixed value. This type is defined in the Var2_Class parameter. (Possible class values for var2 are Data Reference and Fixed Value).
[611] Operators for cases are listed in a separate table. The operands listed in cases are evaluated using the operator.
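Purely as an illustrative sketch of how the business rule, condition, case and operator entities of [608] to [611] might be evaluated (the dictionary keys, the operator set and the AND/OR combination order are assumptions, not the specification's actual schema), in Python:

from typing import Any, Callable, Dict, List

# Operator table ([611]) mapped to evaluation functions.
OPERATORS: Dict[str, Callable[[Any, Any], bool]] = {
    "=": lambda a, b: a == b,
    "!=": lambda a, b: a != b,
    ">": lambda a, b: a > b,
    "<": lambda a, b: a < b,
}

def eval_case(case: dict, data: Dict[str, Any]) -> bool:
    """A case compares two operands using one operator ([610])."""
    left = data[case["var1"]]
    right = data[case["var2"]] if case["var2_class"] == "Data Reference" else case["var2"]
    return OPERATORS[case["operator"]](left, right)

def eval_condition(cases: List[dict], data: Dict[str, Any]) -> bool:
    """All cases of a condition are evaluated in sequence order ([610])."""
    return all(eval_case(c, data) for c in sorted(cases, key=lambda c: c["seq"]))

def eval_rule(conditions: List[dict], data: Dict[str, Any]) -> bool:
    """Conditions are combined with AND/OR in sequence order ([609])."""
    result = None
    for cond in sorted(conditions, key=lambda c: c["seq"]):
        value = eval_condition(cond["cases"], data)
        if result is None:
            result = value
        elif cond["combine"] == "AND":
            result = result and value
        else:
            result = result or value
    return bool(result)

data = {"Status": "Open", "Priority": 1}
rule = [{"seq": 1, "combine": "AND",
         "cases": [{"seq": 1, "var1": "Status", "operator": "=",
                    "var2": "Open", "var2_class": "Fixed Value"}]},
        {"seq": 2, "combine": "AND",
         "cases": [{"seq": 1, "var1": "Priority", "operator": "<",
                    "var2": 3, "var2_class": "Fixed Value"}]}]
print(eval_rule(rule, data))  # True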
[612] Standard routines are sub-routines which can be developed using a computer programming language of choice. These sub-routines can be configured in the "Standard Routine" table and their parameters can be configured in the "Standard Routine Parameters" table. A standard routine is capable of executing a defined task which it is programmed to do. These standard routines can range, without limitation, from simply updating a data element of a given node, to executing an .EXE file, to sending an SMS using a gateway, to passing some data to a web service.
[612-1] The standard routine has a parameter called "Executable Component" which specifies what to execute as this standard routine. The value is implementation dependent and can be a specific function or procedure from an executable code library.
[613] Once the conditions associated with a given business rule are satisfied, the macros set up for the business rule can be executed. A macro is an instance of a standard routine which is set up for the given business rule. Each macro contains multiple macro parameters holding concrete values for the standard routine parameters. Macros have their own sequence numbering to define the order of execution.
[613-1] Macro parameters are the concrete values passed to each macro, which in turn will call a standard routine. Each parameter has a "Variable ID", which is the reference of the id of the data (a concrete instance of meta-data). "Variable Class" will specify whether the value is a reference or a fixed literal. (Possible class values are "Data Reference" or "Fixed Value"). Where the variable class is "Fixed Value", the fixed value is set in the "Variable Value" field.
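The macro execution of [613] and [613-1], in which macro parameters are resolved to concrete values and passed to the configured standard routine, might be sketched in Python as follows; the registry of standard routines, the example routines and all names are illustrative assumptions rather than the specification's implementation:

from typing import Any, Callable, Dict

# Hypothetical registry standing in for the "Executable Component" lookup
# of a standard routine ([612-1]).
STANDARD_ROUTINES: Dict[str, Callable[..., Any]] = {
    "update_data": lambda node, field, value: node.update({field: value}),
    "send_sms":    lambda number, text: print(f"SMS to {number}: {text}"),
}

def run_macro(macro: dict, data_store: Dict[str, Any]) -> None:
    """Resolve each macro parameter to a concrete value (data reference or
    fixed literal, [613-1]) and invoke the configured standard routine."""
    args = []
    for param in sorted(macro["parameters"], key=lambda p: p["seq"]):
        if param["variable_class"] == "Data Reference":
            args.append(data_store[param["variable_id"]])
        else:                                   # "Fixed Value"
            args.append(param["variable_value"])
    STANDARD_ROUTINES[macro["standard_routine"]](*args)

store = {"node_42": {"Status": "Open"}, "contact_no": "+9411xxxxxxx"}
run_macro({"standard_routine": "send_sms",
           "parameters": [
               {"seq": 1, "variable_class": "Data Reference", "variable_id": "contact_no"},
               {"seq": 2, "variable_class": "Fixed Value", "variable_value": "Incident assigned"},
           ]}, store)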
[614] When a node is in a specific state, specified in the "Node Status" table as the most current and valid status for that particular node, it is possible to check whether the elapsed time period since the node entered that status is greater than the configured maximum duration and, if so, an alternate path trigger can be executed to change the flow of the node.
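An illustrative Python sketch of the elapsed-time check of [614], assuming the status timestamp and the configured maximum duration are available as simple values (the function name is not part of the specification):

from datetime import datetime, timedelta

def needs_alternate_path(status_set_at: datetime, max_duration: timedelta,
                         now: datetime) -> bool:
    """True if the node has stayed in its current status longer than the
    configured maximum duration, so an alternate path trigger should fire."""
    return (now - status_set_at) > max_duration

set_at = datetime(2014, 1, 1, 9, 0)
print(needs_alternate_path(set_at, timedelta(hours=4),
                           datetime(2014, 1, 1, 14, 0)))  # True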
CONTINUOUS RELEASE MODEL
[700] As solutions to problems (b) and (i) in the Problem Domain aforesaid, the software tool can be built based on the concepts invented in the Meta-Model by means of a continuous release process to support service continuity and prevent service breakdowns.
[701] The whole concept is based on running multiple versions of elements in a single environment. Multiple versions of data structures, processes, and configurations running in a single environment support smooth transition.
[702] To enable running of multiple versions, it is required to have a mechanism to select the active process, data structure or configuration item at a given moment of time. This is achieved by using an active date range for each element including all the data tables discussed in both Meta-Model and process model. (This includes Instance, Meta Structure, Meta Node, Meta Data, Meta Qualifier, Qualifier, Node, Data, Transaction, Trigger, Business Rule, Condition, Case, Operator, Standard Routine, Standard Routine Parameter, Macro, and Macro Parameter).
[703] "Valid From" and "Valid To" date/times are introduced commonly to all the elements of the Meta-Model to enable what version is valid at a given moment of time. Whatever component (programming component/software or any other consumer) which accesses an above element should use "Valid From" and "Valid To" dates to access the appropriate version of the element.
[704] Co-existence of different releases in the same environment means same meta-node or meta-data etc. could have two versions running on the same instance with different validity periods related to different releases. Releases are deployed to the same environment but can be appropriately accessed by the programs and other consumers in a way that only the currently active (effective) release will be selected using the said date filter [703].
[705] The Meta-Model supports copying and creation of full or partial data structures, processes and configuration elements by providing a cloning mechanism. This cloning can be used to serve two purposes:
a) To create clones to build enhanced application or business content by using as a base.
b) To share best practices between members of the community, by sharing elements (which represent structures, processes etc.).
[706] Rollout of releases to the base release (from which the release was originally taken as a clone) is supported without service interruptions (hot patching). This presupposes the capacity for parallel running of multiple versions of releases inside an instance.
[707] This capability of cloning, copy-pasting full or partial content also allows incremental creation/development of business content in an agile manner.
[708] The functionality of [707] provides the ability to build multiple layers of business content in an evolutionary manner by allowing layers such as, but not limited to, a "Base Layer", "Community Layer", "Domain Layer" and "Application Layer".
[709] The functionality of [707] also provides the ability to implement various releasing configurations such as "Major", "Minor" and "Patch". Hot patching is also possible, as applying a patch does not require shutting down the system and continuous application of releases is supported.
[710] Figure 9 shows the effect of adding an additional field "Contact Details" to the customer data structure. In this scenario a new meta-data element is added with a different "Valid From" date. All the data related to the previous structure will exist as it was, and when new data is entered from the effective date of the new data element it will be stored against the newly added meta-data ("Contact Details"). When the system accesses the old structure only the old data will be used, but when the system accesses the new structure the new data elements will also be retrieved accordingly. (This is because old processes only know old data and new processes know new data.)
[711] Figure 10 shows the effect of deleting an existing data field "Contact Details" of the customer data structure. In this scenario the meta-data element is deleted by setting the "Valid To" date to an appropriate ending date for the meta-structure. All the data related to the previous structure will exist as it was, and when new data records are entered to the structure only the valid data elements will be entered / prompted. (This is because old processes only know old data and new processes know new data.)
[712] "Change" of an existing data element is implemented combining the techniques [710] and [71 1 ]. When there is a need for a change to a data element, previous element is marked as "deleted" and a new element is created with the required change. However extract-transfer and load mechanism is required to set the existing data compliant with the change applied and referring to the newly created data element.
[713] For a system to run properly, it is required that the proper data structure version be accessed by the relevant process. Implementing this provides the capability to run multiple process versions in parallel, accessing the relevant multiple data structures appropriately. Figure 23 shows this arrangement using the same Meta-Model discussed so far.
[750] Complete consistency of old data with newly changed data structures is required for a system to function properly. For this purpose, the Meta-Model supports consistency auditing facilities by providing the required information.
[751] It is possible to check each meta-data element currently active against the existing data which represents the corresponding meta-data. All the applicable rules should be tested against the old data with the new meta-data to find whether any inconsistencies have occurred during the modification cycle. (Modifications are done to an instance and its configuration with the intention of releasing a new version of the model.)
[752] A future date and time should be used for all the new changes to the Meta-Model while performing the consistency auditing, to make sure they will not be active while the process is running.
[753] Once the auditing is completed and the inconsistencies found, it is possible to transform any inconsistent data to be consistent and to match the rules of the corresponding meta-data. Then the activation dates ("Valid From") of the meta-data can be set to the required date on which the new release should be published.
MULTILINGUALISM MODEL
[800] Multilingualism support can be easily implemented by adding two additional tables to the Meta-Model. The multilingualism feature supports entering literals in multiple languages as well as allowing users to view content in their preferred language if available.
[801] According to the arrangement shown in Figure 11, additional tables are introduced, namely "Language" and "Translations".
[802] "Language" table has Language ID, Language Name fields.
[803] "Translations" table has Language ID, Resource Type (0=System Literals, 1 =lnstance, 2=Meta Structure, 3=Meta Node, 4=Node, 5=Meta Qualifier, 6=Qualifier, 7=Meta Data, 8=Data, 9=Transaction, 10=Trigger, 1 1 =Business Rule, 12=Condition, 13=Case), Resource ID, Translation , Translation2 fields.
[804] When a translation is to be stored, first the id of the resource (Resource ID) which needs to be translated is selected. Then the Language ID and Resource Type are selected; the Translation1 field contains the Display Legend and the Translation2 field can contain the Display Assistance (for all meta type elements). A node will have its display column as Translation1 and Data will have its Value as Translation1. For both Node and Data there will be no Translation2 value.
[805] When retrieving a translation for a given user's preferred language (the user language preference is stored in the user table, which is described in Identity and Access Management), Language ID, Resource ID and Resource Type are selected based on the required resource and the user's preferred language. When the results are received, the Translation1 and Translation2 columns are read for the values.
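By way of a hedged illustration (assuming a simplified in-memory copy of the "Translations" table and illustrative language and resource type codes), the translation retrieval of [805] might be sketched in Python as:

from typing import Dict, Optional, Tuple

# Hypothetical in-memory copy of the "Translations" table, keyed by
# (language_id, resource_type, resource_id).
translations: Dict[Tuple[int, int, int], Tuple[str, Optional[str]]] = {
    # (illustrative language 2, resource type 3 = Meta Node, resource 306)
    (2, 3, 306): ("Client", "Saisir les coordonnees du client"),
}

def translate(language_id: int, resource_type: int, resource_id: int,
              fallback: str) -> str:
    """Return Translation1 for the user's preferred language, or the
    element's own display legend when no translation is stored."""
    entry = translations.get((language_id, resource_type, resource_id))
    return entry[0] if entry else fallback

print(translate(2, 3, 306, fallback="Customer"))  # "Client"
print(translate(5, 3, 306, fallback="Customer"))  # falls back to "Customer"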
[806] Meta-data contains a special field "ValueTranslation" to specify whether the field is allowed to be translated into multiple languages. Translations are permitted only if the "ValueTranslation" field is set to TRUE/YES at meta-data level.
AUDITING AND LOGGING MODEL
[850] Auditing of meta-elements is achieved by the built-in facility of having all the different versions of meta-elements in the same environment (including the deleted ones). As per the continuous release model, all changes are done to copies of meta-elements and will be stored in an inactive state by setting the appropriate Valid From and Valid To dates.
[851] An auditing program can access this information to produce a report of all the historical values of each meta-element which are in the inactive state, to provide the log of change history.
[852] As all the data changes are stored in copies of nodes and data, they also will be available to track the history of data changes. A report can be produced to list all the changes done to nodes and their data. This report will provide the required transaction history.
IDENTITY AND ACCESS MANAGEMENT MODEL
[900] Accessibility to Meta-Model elements is controlled using the Identity and Access Management Model. The table structure is shown in Figure 12.
[901] Users can have multiple profiles and Profiles will be associated with multiple Users. The objective is to group a set of common types of profiles to share common rights.
[902] Roles are used to collect a set of permissions for a given scope which the role controls.
Transactions are listed under roles according to what a role is allowed to perform.
[903] Once roles are defined, the roles can be assigned to profiles according to the intended function of each profile. In this case a single role can be assigned to multiple profiles, and a single profile can have multiple roles from which it derives its available/allowed transactions.
[904] Transactions being the entry points to the process model (refer [600]), the security permissions set here in the Identity and Access Management model will control the accessibility of functionality for users.
COMMUNICATION AND CONNECTIVITY MODEL
[1000] It is possible to develop a set of Application Programming Interfaces (APIs) (for example a REST (Representational State Transfer) based API set) to access the functionality of a concrete implementation of the Meta-Model and its content. Providing an API set (using a suitable set of software tools of choice) enables communication between the Meta-Model implementation (in the form of software) and external services. The implementation will provide functionality to push and pull data based on the configured setup.
[1001] The Meta-Model being a dynamically configurable system, it is possible to create structures to represent various automation interchange end-points. By providing a communication API, it is possible to create a consistent communication model to integrate various participants, including external systems and legacy systems, into a common platform.
[1002] Once the API set is created to access the Meta-Model and its content, it is also possible to generate user interface screens to represent and access the Meta-Model and its data, enabling the communities to access the communication defined by the model's concrete business content and making it a tool to support community integration. Forms (using data structures) such as, but not limited to, Incident, Problem Form, Configuration, Issue Tracking and Employee Time Sheet Entry can be created.
[1003] The process model can be used to implement workflows using the forms created; rules and interfacing contracts can be defined, as well as the integration with external systems.
[1004] While providing the form, workflow and business rules functionality, the concrete implementation of a Meta-Model will also provide the continuous release, multilingualism, consistency auditing and transaction auditing functionality in parallel and simultaneously, providing the missing essentials of today's models in one unit.
HOW THE PRESENT INVENTION; THE BUSINESS PROCESS META-MODEL, ADDRESSES THE PROBLEM DOMAIN.
This Business Process Meta-Model was designed to address, but is not limited to, the community integration, communication and overall service delivery process management related problems and challenges existing in the Service Delivery Domain, which have been set out hereinbefore: a) Difficulties in handling the complexity of involvement of multiple parties/participants in the form of Partners, Providers, Sub-Providers, Principals etc., in the service delivery process;
The present Invention represents various communities and their participation by means of flexible Meta-Model based data structures to define such communication models.
Using the functions of the Meta-Model discussed in items [504] to [521] it is possible to create data structures which represent the participants of the community and the relationships between them, in the form of business content. The flexibility of the Meta-Model allows creating the required data structures to represent partners and their relationships, as well as communication structures and communication paths, using the elements described.
Data structures can also be created to represent elements such as services provided by a service provider in a business scenario, to cater for today's complex service structural requirements of any hierarchy, with any number of sub-levels and nodes and containing any number of data elements. Meta-nodes can be used to create elements to represent each partner, and meta-node relationships such as "parent child hierarchy" and "references" can be used to define the relationships between them.
Forms can be created from the required data structures to carry communications.
For all entities, including partners, communication structures and other required entities, their properties can be designed using meta-data (addressed further in the solution to issue 'a'). b) The inherent complexity of delivered services, as such services are compositions of multiple sub-services and bundled services, which comprise service components of varied types;
The present invention addresses and resolves this by representing and managing the complex service delivery models using this novel Meta-Model.
Elaborating further, the present invention, the Business Process Meta-Model, provides a way to represent various communities and their participation by means of data structures built using the flexible Meta-Model. It also supports the definition of the required communication models and allows the management of complex service delivery models using the said Meta-Model.
Once partners (or any required organizational structure) are created, and communication structures / forms are created, it is possible to define the flow of communication using the process model described in [600] to [614] found herewith. Using transactions defined for form flow, it is possible to define business rules that process the form or move the form along the communication channel, making it possible to define flexible communications that address issues (a) and (b) aforesaid. c) The issues arising when delivery models must be rapidly changed due to customer requirements, demand patterns, innovation, strategic requirements, government involvement etc.;
Through this invention, the Business Process Meta-Model, designers can change the delivered services rapidly using graphical Meta-Model designing tools, and such changes can be implemented in a continuously evolving manner. Using the Meta-Model of the present invention, it is possible to change the structures without requiring additional programming support or any other additional software writing, because data structures can be created and changed dynamically to meet rapidly changing needs. These rapid changes can be applied to a live solution by using the continuous release management detailed in paragraph [700] found herewith. d) The complexity of activities involved in the delivery process, which relates to tracking of issues and problems, managing releases and managing configurations;
The present invention, the Business Process Meta-Model, provides business best practices in the form of ready-made data structures and processes that solve common business problems. This problem is resolved by creating various functional content, using the above core Meta-Model and process model, to provide solutions / business support functions such as problem tracking, issue tracking, release management, change management etc. e) The problems associated with streamlining the interaction of the System Component and the People Component, which is due to the lack of defined processes;
The present invention, the Business Process Meta-Model, resolves this through the capability to define processes that streamline communication between the participants of the respective communities. f) Inability to maintain consistent communication between parties, which results from a lack of governance, management of communication and controlled communication;
The present invention, the Business Process Meta-Model, provides a mechanism to implement defined and controlled communication between participants, providing a consistent communication model.
Elaborating further, the aforesaid problems (e) and (f) are addressed and resolved by the present invention by using a defined and consistent data structure to represent every entity in the communication process, making it possible to streamline the communication of the participants involved. Processes can be defined that specify the rules and how communication and interactions are managed between other processes, people and systems, enabling defined and controlled communication between these participants.
g) Lack of adherence to a common set of standards, common terms, glossaries and other standardized communication protocols;
The present invention, the Business Process Meta-Model, has the ability to define common standards, glossaries, business structures and terms, and to share them within the community so that concepts are interpreted in the same way, in the form of a common language.
This is done using the core Meta-Model and process model by creating standards for communications such as paths, escalations, reporting hierarchies, form and presentation layouts, standard value sets / lists of values (LOVs), terminologies, glossary term lists etc. This allows the participants to work on the same ground and speak a common language which everyone can understand from the point of view of standards, terms and vocabulary.
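Purely as an example of such shared standards (with names invented for illustration), a list of values and a glossary can themselves be held as ordinary data structures in the model, so every participant validates and interprets terms in the same way:

```python
# Sketch: shared standards kept as plain data structures so every participant
# interprets terms the same way.
SEVERITY_LOV = ["Low", "Medium", "High", "Critical"]   # standard value set

GLOSSARY = {
    "Incident":   "An unplanned interruption to a service or reduction in its quality.",
    "Problem":    "The underlying cause of one or more incidents.",
    "Escalation": "Transfer of an item to a higher level in the reporting hierarchy.",
}

def validate_severity(value: str) -> bool:
    """All partners validate against the same shared list of values."""
    return value in SEVERITY_LOV

assert validate_severity("High")
assert not validate_severity("Urgent")   # not a shared term, so rejected
```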
h) Difficulties in integrating "external systems" and "partners' IT solutions" to participate in the communication exercise;
The present invention, the Business Process Meta-Model, provides facilities and infrastructure that allow external/partner IT systems to participate in the communication, by providing a simple and easy integration platform.
Integration with external systems requires data exchange in varied formats, and the corresponding structures must be created and implemented in a software environment in order to communicate with those systems. Through the present invention, it is possible to create these structures using the Meta-Model to facilitate the information interchange. Using the core Meta-Model [504] and the process model [600] found herewith, it is possible to create structures to represent these complex data interchange models. Once done, an automated service can be created that uses the Meta-Model along with the services layer specified in paragraphs [1000] - [1002] herewith, with the support of an ESB, to communicate messages (which contain the required data according to the created structures). Service continuity issues, delays in transitions, inconsistencies in communication, unsuccessful implementation attempts and other issues of current IT Solutions and Tools;
The present invention, the Business Process Meta-Model, resolves this issue through the management of automated and continuous releases of system changes in an uninterruptible manner, applying changes to systems to align with the rapidly changing business and providing smooth transition cycles.
Most service continuity issues, service breakages, delays in transitions, inconsistencies, unsuccessful implementations and other related issues arise when new IT tools, or new versions of the same, are introduced to implement required changes. Such changes require the implementation of new data structures, processes and communication mechanisms, along with the respective configurations required, and typically involve additional programming / program rewriting / program changes, process changes etc. Service breakdowns of long duration can occur due to lengthy transition cycles. The presently invented Meta-Model supports the continuous application of releases [as provided below in paragraph 700] while the system is in operation, without breaking the service or incurring service down time, in addressing and resolving this issue.
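For illustration only, the following sketch shows one way co-existing versions of a meta-data element could be resolved at run time using "Valid From" / "Valid To" dates, in the spirit of the continuous release management described above. Only the "Valid From" and "Valid To" field names come from the figure descriptions; the MetaDataVersion class, the helper and the dates are assumptions.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MetaDataVersion:
    name: str
    valid_from: date
    valid_to: Optional[date] = None   # None means still valid

def active_version(versions, on: date):
    """Pick the version whose validity window covers the given date."""
    for v in versions:
        if v.valid_from <= on and (v.valid_to is None or on <= v.valid_to):
            return v
    return None

customer_contact = [
    # Original definition, retired at the end of 2014.
    MetaDataVersion("Contact Details", date(2013, 1, 1), date(2014, 12, 31)),
    # Replacement released without interrupting the running system.
    MetaDataVersion("Contact Details v2", date(2015, 1, 1)),
]

print(active_version(customer_contact, date(2014, 6, 1)).name)   # Contact Details
print(active_version(customer_contact, date(2016, 6, 1)).name)   # Contact Details v2
```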
Furthermore, the presently invented Meta-Model consistently supports a way to define and manage business processes and their changes effectively and efficiently, in less time and at low cost, to address this problem. The design agility of the present invention allows effective design of business processes, while the Continuous Release Model and the Consistency Auditing Model provide efficient, fool-proof deployment of business processes with guaranteed up time. By eliminating the additional programming effort required for changes, the present invention can significantly reduce the time and cost associated with change implementations.
Difficulties in representing and handling real world business data structures and process models with existing IT Solutions and Tools, with the required consistency, efficiency, effectiveness, time and cost.
The present invention, the Business Process Meta-Model, has the ability to define and handle real world business data structures and processes using a highly dynamic, configurable Meta-Model, with minimum engineering effort, with consistency and within a short time span. Furthermore, these processes are developed in an agile manner.
The aforesaid problems, in the context discussed, are addressed and resolved by this invention through the provision of combined solutions for common problems, not limited to the service delivery domain. There are products which provide some of these services in isolation, but not as a single solution, whereas this invention is able to resolve the same as a single business solution. Using a combination of multiple suitable products can be an alternative to this invention; however, that approach would be burdened with heavy costs. Moreover, the effort and complexity of such integration would be high, and the ultimate requirements of consistency and robustness would not be achieved.
In addition, using multiple products in the form of a highly complex solution, with many different tools and solutions providing services to overcome the problem, creates extra complexity to be managed. A comprehensive solution such as the present invention is a single, integrated, comprehensive solution which can be easily implemented, adapted and incrementally developed to suit an organization's requirements within short time spans.
The following numbered paragraphs provide various aspects of the invention.
1. A specific Meta-Model to facilitate modeling of,
a. Hierarchical data structures with structural relationships,
b. Data structural rules of cardinality, hierarchy and recursion of elements,
c. Data structure groups with logical isolation according to usage,
d. Configurable data, formatting and rules of data,
e. Structural elements with qualifiers to specify the extended behavior of structure elements,
f. Structure elements with functional permissions,
g. Structural elements with audit information in the model itself,
h. Version information and history self-tracked by the structure elements themselves,
i. The ability to store multiple versions of the same structure element in the structure itself,
j. Allowing the existence of multiple released versions of the same structure, providing consistent and independent access to the required versions of the structure.
2. A specific process model extending the Meta-Model in paragraph 1, further comprising,
a. Configurable data structures and processes to define business processes using configuration only (without writing any programs, re-compilation or any other programming tasks), while providing,
b. The ability to define transactions which are invoked by various triggers, with facilities for the definition of business rules which are executed based on evaluated criteria, where the criteria are created from conditions and their cases, while providing,
c. The ability to execute defined standard routines with dynamic parameters, where standard routines are available for the required common activities to be performed.
3. A Meta-Model according to paragraph 1, further comprising the intention to keep the business data structures, processes and rules.
4. A Meta-Model according to paragraph 1 and paragraph 2, further comprising the ability to copy and paste business data structures, processes and rules from one instance to another.
5. A Meta-Model according to paragraph 1, further comprising the capability to hold multiple active versions of the model in the Meta-Model itself, in the form of co-existing versions of the same structural element.
6. A Meta-Model according to paragraph 1, further comprising the ability to run multiple release versions of the same Meta-Model in a single platform in a harmonized manner, without affecting the data or functionality of each version.
7. A Meta-Model according to paragraph 1, further comprising a hierarchical model with functional permissions on different levels of structures for given roles and profiles.
8. A Meta-Model according to paragraph 1, further comprising auditing and historical change information inside the structures themselves, without the need for an additional auditing mechanism to track audit information, while providing,
9. A Meta-Model according to paragraph 1, further comprising the capability to audit the consistency of existing data against modifications made to the Meta-Model, to fix any consistency issues, while providing,
10. The capability to store necessary data, processes and configurations in multiple languages, while providing,
Overall Concept Aspects
11. A Meta-Model and a process model according to paragraph 1, further comprising the intention to keep ready-made best practices. The design supports the capability to carry ready-made business content, in the form of processes and structures, which can be immediately applied. This provides best practices and processes as ready-made content to quickly transition and implement for the service delivery, bringing quick wins.
12. A Meta-Model and a process model according to paragraph 1, further comprising the intention to support sharing of existing best practices with interested members of the community. This facility for sharing practices is built into the Meta-Model and process model, with release management providing the necessary functionality to copy and paste business best practices between interested parties without interrupting services.
13. An integration platform with ESB capability, in combination with a dynamic data and process model and a set of business processes, forming a flexible communication toolset to quickly implement integration solutions that connect partners in a community as well as systems, providing dynamism while maintaining service continuity when system changes are required.
14. An integration platform according to paragraph 4, further comprising facilities to provide uninterrupted operation and running of business processes and structures of the business content using continuous release management, by allowing multiple versions of the same processes or data structures to run simultaneously, making it possible to transition smoothly from one version of the business content to another without service interruptions and with zero down time.
15. An integration platform according to paragraph 4, using the concept presented in claim 1, further comprising a method of applying changes to the processes and application data structures without program recompilation, by using purely configuration-based data modeling.
16. An integration platform according to paragraph 4, using the concept presented in claim 1, further comprising a method of applying changes to the processes and application data structures in an incremental manner, providing agile development of business processes while the system is continuously running.
17. A Meta-Model according to paragraph 1 which, while providing the above facilities, provides in-built functions such as management of the Meta-Model itself, control of changes to the Meta-Model itself, and availability of audit information and of the versioning of structures, processes and data in the Meta-Model itself, making it a self-tracked Meta-Model that does not require the support functions to be separately included.
18. An automatic consistency tracking system built into the system itself, with data consistency auditing support for when a later version has inconsistencies with previously existing data (an illustrative sketch of such a check follows this list).
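The consistency auditing referred to in paragraphs 9 and 18 above could, purely as an illustrative sketch with assumed names and rules, take the form of a check of existing data records against a modified meta-data definition:

```python
# Sketch: audit existing data records against a changed meta-data definition and
# report inconsistencies instead of silently breaking the running system.
META_DATA = {"Name": "mandatory", "Address": "optional", "Contact Details": "mandatory"}

EXISTING_RECORDS = [
    {"Name": "Acme Ltd", "Address": "1 Main St"},                  # created before the change
    {"Name": "Beta Inc", "Contact Details": "beta@example.com"},
]

def audit_consistency(meta_data: dict, records: list) -> list:
    """Return one finding per record that no longer satisfies the meta-data."""
    findings = []
    for i, record in enumerate(records):
        missing = [f for f, rule in meta_data.items()
                   if rule == "mandatory" and f not in record]
        unknown = [f for f in record if f not in meta_data]
        if missing or unknown:
            findings.append({"record": i, "missing": missing, "unknown": unknown})
    return findings

for finding in audit_consistency(META_DATA, EXISTING_RECORDS):
    print(finding)
# {'record': 0, 'missing': ['Contact Details'], 'unknown': []}
```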
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 3: Meta-Model Core Tables. This diagram shows the core tables of the Meta-Model.
Figure 14: Isolated instances. The Business Process Meta-Model has a top level isolation making it possible to create multiple, independent, self-contained, self-functioning instances.
Figure 15: Specifies the meta-structure arrangement to represent data structures in an instance.
Figure 16: Hierarchical Data Structures using Meta Nodes.
This Figure shows the capability to define multiple child nodes in hierarchical structures using meta-nodes.
Figure 6: The arrangement of sub-structural components of a typical EDI Form (Electronic Data Interface Form), in which the sections' visibility and accessibility can be controlled at an individual level.
Figure 17: Meta-data representation of a sample meta-node.
Figure 18: Meta-node to Node and Meta-Data to Data relationship.
Figure 19: Diagram of a meta-qualifier and qualifiers connected to a specific meta-node, providing specialization to the data nodes that the meta-node represents.
Figure 20: Depicts the usage of a qualifier to qualify a node by providing specialization.
Figure 21: Represents the recursively repeating meta-node arrangement and the notation for such an arrangement. This allows users to create hierarchical data elements of the same type.
Figure 7: Usage of Qualifiers to enhance the recursively repeating meta-node's hierarchy.
Functionality qualifiers can be arranged in a hierarchical structure which governs the hierarchy of the recursively repeating meta-node by giving a set of hierarchy rules.
Figure 22: Functional key scenarios for Node/Data arrangements. In some cases multiple objects can exist when there is no unique key constraint specified. The functional key scenarios for data elements are demonstrated here.
Figure 4: Definition of References. In the meta-data it is possible to define a referred meta-node denoting which types of nodes can be referred to. The reference mechanism is illustrated in Figure 4.
Figure 5: Implementation of References. The possible nodes which can be referred to by a data element are specified at the meta-data level. The reference mechanism illustrated here is similar to that of Figure 4.
Figure 8: Process Model - Part 1 - Meta-Nodes, Nodes and Associated Status.
Figure 23: Process Model - Part 2 - Triggers, Transactions and Business Rules.
Figure 9: Release management - Add a new meta-data element.
This figure shows the effect of adding an additional field "Contact Details" to the customer data structure. In this scenario a new meta-data element is added with a different "Valid From" date.
Figure 10: Release management - Delete existing meta-data elements.
This shows the effect of deleting an existing data field "Contact Details" from the customer data structure. In this scenario the meta-data element is deleted by setting the "Valid To" date to an appropriate ending date for the meta-structure.
Figure 23: Release management - Process Versions accessing Data Structure Versions.
For a system to run properly, each process must access the proper data structure version. Implementing this provides the capability to run multiple process versions in parallel, each accessing the relevant data structure versions appropriately. Figure 23 shows this arrangement using the same Meta-Model discussed so far.
Figure 11: Multilingualism Model. According to the arrangement shown in this Figure, additional tables are introduced, namely "Language" and "Translations".
Figure 12: Identity and Access Management Model. Accessibility to Meta-Model elements is controlled using the Identity and Access Management Model. The table structure is shown in this figure.

Claims

DATA PROCESSING CLAIMS:
1. A method of processing data, the method comprising:
receiving data to be processed;
receiving a model representing the data and relationships between the data, the model defining at least one rule associated with data represented by the model; and
processing the received data based upon the at least one rule defined by the model.
2. A method according to claim 1, wherein the model represents a plurality of instances of the data, each instance of the data having associated validity data, wherein processing the data based upon the model comprises determining an instance of the data to be processed based upon the validity data associated with each instance of the data.
3. A method according to claim 2, wherein the validity data comprises a time, wherein determining an instance of the data to be processed based upon the validity data associated with each instance of the data comprises determining a relationship between the time associated with instances of the data and a predetermined time.
4. A method according to any preceding claim, wherein the model models a plurality of hierarchical data structures and relationships between the hierarchical data structures.
5. A method according to any preceding claim, wherein the at least one rule defines properties of the data.
6. A method according to any preceding claim, wherein the at least one rule defines properties associated with processing of the data.
7. A method according to any preceding claim, wherein the model defines a plurality of groups, wherein each group is associated with one or more data items represented by the model.
8. A method according to claim 7, wherein each group defines a logical isolation from other ones of the plurality of groups.
9. A method according to any preceding claim, wherein the model comprises a plurality of entities, the plurality of entities together representing the data and relationships between the data.
10. A method according to claim 9, wherein the model comprises a plurality of instances of an entity, each instance of the entity comprising data defining a version associated with the entity.
11. A method according to claim 9 or 10, wherein a first plurality of entities represent data and relationships between the data and a second plurality of entities define rules associated with the data and processing of the data.
12. A method according to claim 11, wherein each of the second plurality of entities has a relationship with at least one entity of the first plurality of entities, each relationship providing an association between a first entity and a second entity, wherein data represented by a first entity is processed based upon rules defined by second entities having a relationship with the first entity.
13. A method according to any one of claims 9 to 12, wherein each entity comprises data associated with creation and/or modification of the entity.
14. A method according to claim 13, wherein the data associated with creation and/or modification of the entity comprises data associated with a property selected from the group consisting of: a date the entity is created, a user that created the entity, a date the entity was last updated and a date that the entity was last used.
15. A method according to any preceding claim, further comprising receiving a further model, the further model representing data processing functionality and rules associated with the data processing functionality, wherein processing the received data comprises:
receiving data indicating data processing functionality to be performed on the received data; and
processing the received data based upon the received data indicating data processing functionality and based upon rules associated with the data processing functionality.
16. A method according to any preceding claim, further comprising receiving updated model data, and modifying the model representing the data and relationships between the data based upon the updated model data, wherein modifying the model comprises:
storing an instance of the model corresponding to the received model; and
generating an instance of the model based upon the received model and the updated model data.
17. A method according to any preceding claim, wherein the model models a plurality of qualifiers associated with data represented by the model.
18. A method of processing data at a plurality of data processing systems, the method comprising:
processing first data at a first data processing system according to any preceding claim to generate first processed data; and
processing the generated first processed data at a second data processing system according to any preceding claim.
19. A computer program comprising computer readable instructions configured to cause a computer to carry out a method according to any preceding claim.
20. A computer readable medium carrying a computer program according to claim 19.
21. A computer apparatus for processing data comprising:
a memory storing processor readable instructions; and
a processor arranged to read and execute instructions stored in said memory;
wherein said processor readable instructions comprise instructions arranged to control the computer to carry out a method according to any one of claims 1 to 18.
PCT/IB2014/065652 2013-10-28 2014-10-28 Data processing WO2015063675A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
LK17419 2013-10-28
LK1741913 2013-10-28

Publications (1)

Publication Number Publication Date
WO2015063675A1 true WO2015063675A1 (en) 2015-05-07

Family

ID=52432854

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/065652 WO2015063675A1 (en) 2013-10-28 2014-10-28 Data processing

Country Status (1)

Country Link
WO (1) WO2015063675A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1991008543A1 (en) * 1989-11-30 1991-06-13 Seer Technologies, Inc. Computer-aided software engineering facility

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
IAN GRAHAM ET AL: "UML - a tutorial", 21 November 2001 (2001-11-21), pages 1 - 120, XP055176524, Retrieved from the Internet <URL:http://cc.ee.ntu.edu.tw/~farn/courses/BCC/NTUEE/2012.spring/uml_tutorial.pdf> [retrieved on 20150313] *
IAN GRAHAM ET AL: "UML - a tutorial", 21 November 2011 (2011-11-21), XP055176605, Retrieved from the Internet <URL:http://cc.ee.ntu.edu.tw/~farn/courses/BCC/NTUEE/2012.spring/uml_tutorial.pdf> [retrieved on 20150316] *
PATON N W ET AL: "Active database systems", ACM COMPUTING SURVEYS, ACM, NEW YORK, NY, US, US, vol. 31, no. 1, March 1999 (1999-03-01), pages 63 - 103, XP002354519, ISSN: 0360-0300, DOI: 10.1145/311531.311623 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14831071

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14831071

Country of ref document: EP

Kind code of ref document: A1