WO2002065232A2 - Method and system for facilitating assessments - Google Patents

Method and system for facilitating assessments

Info

Publication number
WO2002065232A2
Authority
WO
WIPO (PCT)
Prior art keywords
assessment
requirements
input
category
recited
Prior art date
Application number
PCT/US2001/051000
Other languages
French (fr)
Other versions
WO2002065232A3 (en)
WO2002065232A8 (en)
Inventor
Jonathan George Beers
Frank Heinrich Bakes
Original Assignee
The Procter & Gamble Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Procter & Gamble Company filed Critical The Procter & Gamble Company
Priority to AU2002251719A priority Critical patent/AU2002251719A1/en
Publication of WO2002065232A2 publication Critical patent/WO2002065232A2/en
Publication of WO2002065232A8 publication Critical patent/WO2002065232A8/en
Publication of WO2002065232A3 publication Critical patent/WO2002065232A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q99/00 - Subject matter not provided for in other groups of this subclass

Definitions

  • the present invention relates generally to a method and system for facilitating and standardizing assessments of business parties, and more specifically, in one embodiment, to the display of a hierarchical arrangement of assessment requirements and categories, and to the electronic entry, storage, analysis and reporting of ratings and comments regarding the requirements and categories.
  • a company typically establishes relationships with a number of business parties for supply of goods and/or services needed in carrying out the company business.
  • a company that manufactures a food product might receive various ingredients for making the food product from a variety of suppliers.
  • Companies have routinely assessed or audited such business partners in order to ensure the quality of the ultimate products produced by the company. For instance, if a supplier's food ingredient is contaminated, this may result in contamination of the ultimate food product produced by the company.
  • while checklists and the like can be utilized to help memorialize the categories to be assessed, sufficient detail is typically not provided in such checklists to ensure that all pertinent requirements are met.
  • capability is usually not provided to quickly view only pertinent requirements for the particular party being evaluated, and safeguards are not provided to ensure that all pertinent assessment categories and requirements are evaluated and that the auditor does not overlook some of these pertinent categories and requirements.
  • one auditor may utilize a slightly different rating scale or evaluation system when compared to another auditor, or when compared to previous assessments made by the same auditor.
  • the lack of detail provided in any written descriptions which may be utilized for the various categories and requirements allows for frequent inconsistency as well as widespread subjectivity from assessment to assessment. Accordingly, it can be very difficult to compare assessments of various business parties, and it can also be difficult to compare multiple assessments of the same business party. Consequently, it can be difficult to determine how one party compares to another, whether a party's goods or services should continue to be utilized, and whether a party is making improvements in any of the assessed categories.
  • the comments and ratings provided by auditors in conventional assessments are usually very text intensive as well.
  • the auditor usually takes notes on paper or dictates ratings and comments into a conventional dictaphone. Once all the categories and requirements have been evaluated, the auditor then utilizes the notes and recordings to prepare a report regarding the business party.
  • preparation of such a report can be time consuming and inefficient, because organizing and properly presenting such notes can be complex.
  • in order to make the report understandable, the auditor usually must spend time to reiterate each assessment category and the requirements relating to that category before providing the evaluation for that category. As can be understood, such reiteration can be quite time consuming and inefficient, especially when there are a number of assessment categories and requirements.
  • the report format, content, and style utilized by various auditors can vary, compounding the difficulty in quickly comparing and reviewing multiple assessments.
  • Assigning ratings for each category which is assessed can also be subjective. While the auditor may identify many deviations regarding the various requirements within the category, the business importance of each specific requirement and its deviations may be difficult to determine upon completion of the assessment.
  • each assessment will typically result in a separate assessment report which is separately filed.
  • it can be difficult to aggregate, summarize, and/or analyze information from various audits of the same or differing parties.
  • a method and system for efficiently evaluating business parties is needed.
  • it is desirable to have such a method and system which ensures that the auditor will evaluate all relevant assessment categories and that clear and standardized requirements are utilized in evaluating these categories.
  • Such a method and system is also needed which allows the auditor to quickly and efficiently enter evaluation ratings and comments, and which assists the auditor in quickly generating standardized reports from such ratings and comments.
  • a method and system is needed for easily accessing, summarizing, and analyzing various assessment reports which are generated.
  • Another object of at least one embodiment of the present invention is to enhance the ability to analyze and evaluate assessment reports regarding business parties.
  • a further object of at least one embodiment of the present invention is to provide easy access to assessment reports regarding business parties.
  • Yet another object of at least one embodiment of the present invention is to create a common database of assessment comments and ratings for use in identifying common trends and/or areas for improvement.
  • the above objects are provided merely as examples, and are not limiting nor do they define the present invention or necessarily apply to every possible embodiment thereof. Additional objects, advantages and other novel features of the invention will be set forth in part in the description that follows and will also become apparent to those skilled in the art upon consideration of the teachings of the invention.
  • a method is provided for obtaining the evaluation of the performance of a business party. The method comprises displaying a requirement corresponding to a business performance expectation, and displaying possible ratings to be selected for the requirement. The method also comprises receiving a rating input from a user indicating a selection of one of the possible ratings and saving the selected rating along with an identifier of the requirement to which it applies.
  • FIG. 1 is a perspective view of a handheld tablet PC and related devices, which can run software programs for use in assessing business parties, according to principles of the present invention
  • FIG. 2 is a flow diagram illustrating an exemplary automated method of assessing business parties, according to principles of the present invention
  • FIG. 3 illustrates an exemplary assessment setup screen, which can be utilized to receive setup inputs, such as assessment type, focus, objectives and the like, according to principles of the present invention
  • FIG. 4 illustrates an exemplary assessment category screen, which can be utilized to view pertinent assessment categories, according to principles of the present invention
  • FIG. 5 illustrates an exemplary hierarchical category screen, which can be utilized to view category standards and related subcategories and requirements, such as by selecting one of the categories of FIG. 4, according to principles of the present invention
  • FIG. 6 illustrates an exemplary assessment subcategory screen, which can be utilized to view subcategories and related requirements, such as by selecting one of the subcategories of FIG. 5, according to principles of the present invention
  • FIG. 7 illustrates an exemplary requirement rating screen, which can be utilized to enter comments and ratings regarding various assessment requirements, and which can be viewed by selecting one of the requirements of FIG. 6, according to principles of the present invention
  • FIG. 8 illustrates an exemplary requirement rating screen showing exemplary inputs of comments and ratings, according to principles of the present invention
  • FIG. 9 illustrates an exemplary requirement viewing screen automatically showing the ratings given to various assessment requirements, according to principles of the present invention
  • FIG. 10 illustrates an exemplary assessment category rating screen, which can be utilized for entering observations and ratings regarding various assessment categories, according to principles of the present invention
  • FIG. 11 illustrates an exemplary assessment status screen, which can be utilized to automatically view the progress of the assessment and to automatically summarize ratings assigned, according to principles of the present invention
  • FIG. 12 is a schematic diagram of a computer network, which can be utilized to access various assessment reports from remote locations, according to principles of the present invention
  • FIG. 13 illustrates an exemplary assessment report access screen, which can be utilized to view and select assessment reports to be viewed, according to principles of the present invention
  • FIGS. 14, 15, 16a, 16b, 19a, 19b, and 19c illustrate various exemplary assessment reports that might be generated using the assessment data collected and stored, according to principles of the present invention
  • FIG. 17 is a data table diagram illustrating an exemplary data structure for storing assessment requirements, related categories, subcategories, assessment types, objectives, and other data, according to principles of the present invention
  • FIG. 18 is a schematic diagram illustrating an exemplary process of organizing all performance requirements into a central database, selecting pertinent requirements based upon user inputs, storing observations and ratings for the pertinent requirements, and allowing access to reports and data generated therefrom, according to principles of the present invention
  • FIG. 20 illustrates an exemplary assessment properties screen, which allows the user to change the setup parameters and thereby change the pertinent requirements utilized during the assessment, according to principles of the present invention.
  • FIG. 21 illustrates an exemplary rating guide which can be utilized by the assessor in determining a rating for an assessment category, according to principles of the present invention.
  • DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS Turning now to the drawings in detail, wherein like numbers illustrate corresponding structure, FIG. 1 is an exemplary arrangement comprising a perspective view of a portable computer and related devices, which can run software and/or firmware programs which operate according to principles of the present invention.
  • a portable computer 30 is utilized for viewing evaluation requirements and for providing evaluation inputs, such as ratings (e.g., scores, grades, rankings, etc.) or comments (e.g. observations, notes, opinions), regarding each requirement.
  • the portable computer 30 includes a display 32 for viewing and inputting information regarding these requirements.
  • the display 32 could comprise a monitor, a liquid crystal display (LCD), a touchscreen display (which, in addition to displaying information, also allows the user to enter inputs by touching the display with a finger or with a special pen or stylus 33), and/or any other viewing screen or interface.
  • Any of a variety of devices may also be utilized for providing the evaluation inputs (such as text, ratings, descriptions, observations, graphics, photos, video and the like) regarding the party's performance during the assessment and/or analysis of the assessment data.
  • a keyboard 36 and/or mouse may be utilized to communicate with the computer 30, via wired or wireless channels, to provide inputs to the computer 30.
  • the display 32 may comprise a touchscreen display for entry of inputs via a user's finger, a stylus or pen 33, or other pointing device.
  • a keypad 35, onscreen or hardware based, may also be provided to assist the user in entering data.
  • a microphone 39 can be provided for use in conjunction with voice recognition software for providing inputs to the computer 30.
  • An example of suitable speech-to-text (voice recognition) software that may be utilized is NATURALLY SPEAKING by Dragon Systems, Inc.
  • the microphone 39 may be part of a headset 38 which allows the microphone to be easily worn on the user's head during use, thereby freeing the user's hands.
  • the microphone 39 can clip to the user's body or clothing, may have a separate stand, or may be integral with the computer 30.
  • the microphone 39 can communicate with the computer 30 in any suitable manner, such as by plugging into a port 34 on the computer. Sound and video recording devices can also be utilized to record contemporaneous audio and video relating to the party's business performance with respect to various requirements.
  • the microphone 39 and/or other microphone or sound input devices can be utilized to record the relevant audio.
  • a camera 37 can be utilized, such as a digital camera or still video camera for instance. Such a camera 37 can provide pictures to the computer 30 using any appropriate communication protocol, such as serial communication protocols or wireless protocols for example.
  • the portable computer 30 may comprise a general purpose or special purpose computer including a programmable controller or processor 31, such as a processor commercially available from Intel Corp. or Advanced Micro Devices, Inc. for example, and a non-volatile memory device 33 for storing programs having instructions, such as on a hard disk drive, a CD ROM drive, a floppy disk drive, a flash memory unit, and/or a ROM chip.
  • the computer 30 may also include volatile memory for use in executing programmed instructions, such as RAM or DRAM chips for instance, as well as appropriate circuitry for interfacing with the volatile and nonvolatile memory.
  • Communication ports 34 and related circuitry may also be provided to interface the computer 30 with various devices, including input/output devices, such as keyboards, microphones, and printers for instance, as well as other computer devices via wired or wireless network or modem communications.
  • the portable computer 30 can comprise a computer provided with its own power source such as an onboard battery to allow the unit to be comfortably carried by a user (e.g., an auditor) for extended periods of time, such as while walking through the facility of a business party and conducting an assessment of the party's procedures, operations, organization, equipment, and/or facilities.
  • the computer 30 comprises a tablet computer or pen computer, such as, for instance, the STYLISTIC or PENCENTRA models offered by Fujitsu Personal Systems, Inc. Such computers offer processing power, light weight, and long battery life.
  • any of a variety of computers could be utilized with the methods and programs of the present invention, and the computers described herein may include laptop computers, notebook computers, desktop computers, portable data collectors, input stations, personal digital assistants (PDA's), portable electronic devices, Internet appliances, or other data input and/or display devices.
  • the computer 30 may allow the user to easily view assessment requirements, enter ratings and comments, and prepare assessment reports, and, as will be understood, can take any of a virtually unlimited number of alternative forms.
  • Standard operating system software can be utilized with the computer 30, if desired, such as WINDOWS operating system software for instance.
  • An assessment assistant software or firmware program described in further detail below, can be stored in memory 33 and run on the computer 30 to allow for inputting and reporting assessments.
  • FIG. 2 is a flow diagram illustrating an exemplary series of steps according to which such software or firmware associated with the computer may operate.
  • FIGS. 3-11 illustrate user interfaces, such as computer screen images for instance, which can utilize windows, frames, pages, icons, toolbars, menus, and/or text, in order to display and organize assessment requirements for the user, to receive assessment input from the user, and to automatically prepare assessment reports from the inputs.
  • the user can begin an assessment by executing the stored assessment software program, as shown at block 110 of FIG. 2.
  • This action can consist of selecting the appropriate program from a list or from a group of icons using an input device, or otherwise providing a predetermined input to begin execution of the program.
  • the program may be written in any of a variety of languages suitable for creating software or firmware programs. For instance, languages such as C++, HTML, and/or Visual Basic may be utilized.
  • the user may make a selection to create a new assessment file or to edit an existing assessment file, such as by using the input device to select a button or to select from a menu.
  • a setup screen can be displayed requesting that the user provide particular setup (e.g., characteristic) inputs regarding that new file, as shown at block 112 of FIG. 2.
  • FIG. 3 illustrates an exemplary setup window 200 that may be utilized for this purpose.
  • the name of the auditor (assessor) can be requested by the program and supplied by the user in text box 202 (or automatically supplied by default on a particular user's device)
  • the company or business party being assessed can be requested by the program and supplied by the user in text box 204
  • the location of the company can be requested by the program and supplied by the user in text box 206
  • the scope or type of assessment can be requested by the program and supplied by the user in text box 208.
  • Such text boxes or input boxes can be created in any of a variety of manners, depending on the programming language utilized. For example, commands are provided in Visual Basic to present suitable text or input boxes.
  • a check list box 210 can be provided to allow the user to select whether the assessment is to be a "standard," "advanced," or "enhanced" assessment, and to thereby control the depth and/or number of assessment requirements which are utilized during the assessment.
  • Any of a variety of data input boxes can be utilized to allow the user to select one or more of these assessment types, such as radio buttons, pick list or checklist boxes, and the like.
  • Various suitable textboxes can be created using any of a variety of programming languages, including the Visual Basic language. It is contemplated that the invention might also include computer implemented instructions that would allow a user to customize the user screen, checklists and/or data entry steps for recording performance assessments during particular audits.
  • list box 212 can be utilized to select the objectives (or focus) of the assessment, which can then be utilized by the program to narrow the listing of assessment requirements which will be presented to the user, and provide a more focused, tailored assessment for any particular application.
  • the list includes issues related to development capability, data integrity, control, contamination, and material control.
  • "Material control" relates to the business party's ability to identify and control specific lots of off-quality material from the supply chain, as well as the ability to identify the processing conditions (materials, facility, equipment, operators, settings, etc) that produced specific lots of materials.
  • Contamination relates to things not belonging in or on a material, and/or minimum required purity levels.
  • the control objective relates to the ability of the business party to understand and maintain qualified processing conditions (e.g., Process Control, Quality Control, Change Control, Incoming Material Control, etc.), the party's ability to validate to ensure that intended outcomes (of processes, systems, test methods, etc.) are achieved, and the party's ability to ensure that suppliers are using the approved specifications.
  • Data integrity relates to the party's use of testing data to ensure only acceptable materials or products are released to the company.
  • Development capability relates to the party's ability to innovate in terms of new products/materials, processes, cost savings, solutions to problems, and the like.
  • the list box 214 allows the user to select whether requirements of the ISO (the International Organization for Standardization) will be excluded during the assessment, such as the requirements needed to be fulfilled in order to be certified by the ISO. If the party to be assessed is already ISO certified, then there would be no need to assess the party with respect to requirements overlapping or similar to those of the ISO. Likewise, a selection can also be provided in box 214 to allow the user to select whether the GMP (Good Manufacturing Practices) requirements will be excluded during the assessment.
  • performance assessment inputs, such as those contemplated in this illustrative example, are supplied in the setup screen, and, at block 116, they are utilized to select and display the relevant assessment categories. For example, if the user selected that "standard," "advanced," and "enhanced" requirements be utilized, a more lengthy set of categories and associated requirements would be selected and displayed than had the user selected only the "standard" requirements. Likewise, particular categories and requirements would be included depending on the items selected in the "objective" input box of the setup screen. Moreover, if the user selected the ISO and/or GMP exclusions, then any requirements similar to those used by those organizations will not be selected and displayed to the user.
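  • As an illustrative sketch only (the class and field names below are assumptions, not taken from the patent), the setup inputs of FIG. 3 could be captured in a simple Python record that the selection logic can then consult:

        # Hypothetical record of the setup inputs gathered on the screen of FIG. 3.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class AssessmentSetup:
            assessor: str                  # text box 202
            party: str                     # text box 204
            location: str                  # text box 206
            scope: str                     # text box 208
            types: List[str] = field(default_factory=lambda: ["standard"])  # check list box 210
            objectives: List[str] = field(default_factory=list)             # list box 212
            exclude_iso: bool = False      # list box 214 (party already ISO certified)
            exclude_gmp: bool = False      # list box 214 (party already follows GMP)

        setup = AssessmentSetup(
            assessor="J. Smith", party="Acme Ingredients", location="Cincinnati, OH",
            scope="Supplier quality", types=["standard", "advanced"],
            objectives=["contamination", "material control"], exclude_iso=True)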
  • FIG. 17 illustrates an exemplary relational database structure 90 which might be utilized for automatically determining which requirements to include in a given assessment, based upon the user's setup inputs, such as type, objectives, etc.
  • the assessment requirements are arranged in a database system having records 250 (the assessment requirements) and various fields F associated with these requirements (whether the requirement is standard, advanced, or enhanced; the objectives to which the requirement relates; whether the requirement is related to ISO or GMP, etc.)
  • Which fields apply to a given requirement can be provided in any number of suitable manners, such as by appending a code to the record, selecting a bit, etc.
  • this database 90 is in the form of a tree structure.
  • each requirement 250 is related to a category 222, or key element, such as a starting materials category or a packing operations category, depending on the business area to which the requirement relates.
  • requirements may be grouped depending on the business area or business function to which they are related.
  • each requirement might also be related to one or more subcategories 230 which fall under the main category.
  • identifiers such as codes, numerals, data, and/or letters for instance, could be stored in the database. These identifiers can be matched with the corresponding category or subcategory description for display or reporting purposes.
  • the common fields F allow the computer processor to quickly determine which requirements 250 relate to the selections provided on the start up screen. For instance, in the example of FIG. 17, had the user wished only to utilize ISO requirements during the assessment by selecting the appropriate item in the setup screen, the computer processor would select only Requirements 1.1.1, 1.1.2, and 2.1.1.2 from the universe of requirements in the database 90.
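  • By way of a minimal, hypothetical sketch (the record layout, field names, and matching rule below are assumptions rather than the patent's actual schema), the field matching described for FIG. 17 can be expressed as a simple filter over the requirement records:

        # Hypothetical requirement records carrying the fields F of FIG. 17.
        requirements = [
            {"id": "1.1.1",   "type": "standard", "objectives": {"contamination"},    "iso": True,  "gmp": False},
            {"id": "1.1.2",   "type": "advanced", "objectives": {"material control"}, "iso": True,  "gmp": False},
            {"id": "2.1.1.1", "type": "standard", "objectives": {"control"},          "iso": False, "gmp": True},
            {"id": "2.1.1.2", "type": "enhanced", "objectives": {"data integrity"},   "iso": True,  "gmp": False},
        ]

        def select_requirements(reqs, types, objectives, exclude_iso, exclude_gmp):
            """Return only the requirements whose fields match the setup inputs."""
            selected = []
            for r in reqs:
                if r["type"] not in types:
                    continue
                if objectives and not (r["objectives"] & set(objectives)):
                    continue
                if exclude_iso and r["iso"]:    # skip requirements already covered by ISO
                    continue
                if exclude_gmp and r["gmp"]:    # skip requirements already covered by GMP
                    continue
                selected.append(r)
            return selected

        print([r["id"] for r in select_requirements(
            requirements, ["standard", "advanced"], ["contamination", "material control"],
            exclude_iso=False, exclude_gmp=True)])   # -> ['1.1.1', '1.1.2']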
  • the program displays the categories which are relevant to those requirements which have been selected based upon the user's input.
  • certain requirements are automatically selected by comparing the user's inputs provided in the setup screen with fields in the database.
  • the various categories to which these requirements belong are displayed. This is illustrated in the category window 220 of FIG. 4.
  • the categories 222 related to the selected requirements are displayed in a list window 224.
  • a scroll bar 226 can be provided to allow the user to scroll through this list.
  • Identification information can also be displayed along with the list window 224, such as the business party name 204, the business party location 206, the audit scope/objective 208, the audit type 209 selected by the user in the setup window. Images 228 may also be displayed next to text which is displayed.
  • the user may select, such as by moving a pointer or cursor with an input device, any of the categories 222 in the list 224 window. This is shown in the exemplary flow diagram of FIG. 2. As shown in block 118 of FIG. 2, this selection is received by the computer. In response to the selection, any subcategories associated with the main category selected are displayed, as shown at block 120.
  • the requirements under the main category area can be sub-divided under subcategories.
  • certain assessment requirements in one application might fall under a main category entitled House Keeping/Pest Control/Maintenance.
  • the requirements may be further divided into groups. For instance, some of the requirements may relate to "general principles" of the main category, others may be specific to "housekeeping”, others may be specific to "pest control”, and still others may be specific to "facility maintenance".
  • the exact divisions of categories and sub-categories will depend upon the type of requirements which will be utilized in conducting assessments for an application, and can vary widely depending on the businesses that are assessed, the main business of the assessing company, and/or the level of quality desired.
  • FIG. 5 illustrates a subcategory window 229, which displays this hierarchical relationship in response to a selection of an assessment category by the user.
  • the user selected the category "Housekeeping, Pest Control, & Maintenance" from the category window 220 of FIG. 4.
  • the computer displays a subcategory list 231 which shows the sub-categories 230 which fall under the selected category 222'.
  • Such a hierarchical arrangement of the assessment requirement data can be achieved using appropriate display commands. For example, if Visual Basic is utilized, suitable commands are available to generate such nodes in a hierarchical arrangement:
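  • The specific Visual Basic commands are not reproduced in this text; purely as an illustrative stand-in (the category and requirement names are assumed), the nesting of categories, subcategories, and requirements can be modeled and rendered as a tree, for example:

        # Nested dictionaries standing in for the tree nodes shown in windows 224/240.
        tree = {
            "Housekeeping, Pest Control, & Maintenance": {
                "General Principles": ["Requirement 1.1.1", "Requirement 1.1.2"],
                "Pest Control": ["Requirement 1.2.1"],
            }
        }

        def show(node, depth=0):
            """Print each node indented according to its depth in the hierarchy."""
            if isinstance(node, dict):
                for label, children in node.items():
                    print("  " * depth + label)
                    show(children, depth + 1)
            else:  # a leaf list of requirement descriptions
                for leaf in node:
                    print("  " * depth + leaf)

        show(tree)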
  • a description window 240 is also provided within the sub-category window 229.
  • This description window 240 displays information regarding the categories, sub-categories, and/or assessment requirements.
  • a category description 242 is provided which describes the category 222' which was selected and/or the principle or standard to which the business party will be held with respect to that selected category.
  • the subcategories 230 are listed, as well as any sub-subcategories 232 falling under each subcategory.
  • all of the requirements 250 falling under the selected category 222' are displayed within the Window 229.
  • the selected category 222' can be given an overall numerical rating by the user, as well as a textual comment or observation.
  • category rating buttons 234 are also displayed within window 229 in response to a category 222' being selected. These buttons 234 can be utilized in selecting an overall rating for that assessment category 222'.
  • the ratings are numerical ratings of 2, 4, 6, 8, and 10, corresponding to unsatisfactory, poor, fair, good, and excellent performance in the selected category 222'.
  • a button 234 can also be provided to indicate that the category 222' is still being rated, and another button 234 (N/A button) can be provided to indicate that the category is not applicable (e.g., it will not be rated).
  • one or more comment entry boxes 236 can be provided to allow the user to input text describing the performance of the assessed business party in the selected category 222'. While category ratings and comments can be entered at this point using the buttons 234 and box 236, it is recommended that the user wait until the requirements 250 falling under the category 222' have been evaluated before making an overall evaluation of the category. Moreover, as described below, the program can automatically determine a suggested category rating based upon the ratings given for the requirements 250 under the category 222'. This suggested rating can assist the user in making an overall rating for the category 222'.
  • the user can also select a sub-category in the window 224 of FIG. 5 in order to view a list of any sub-subcategories which are related to the selected sub-category.
  • the related sub-subcategories 232 are displayed, as shown in window 233 of FIG. 6.
  • the selected subcategory 230' is "General Principles”
  • the sub-subcategories 232 falling under this are "Owners Identified and Records Maintained”.
  • the information displayed in the window 240 is limited to the selected subcategory 230', as well as the related sub-subcategories 232 and the related requirements 250 falling under the selected subcategory 230'.
  • This process of selecting assessment subcategories 230 and displaying the related sub-subcategories 232 is shown in blocks 122 and 124 of FIG. 2.
  • the user's selection from the input device is received at block 122, and the sub-subcategories are displayed at block 124, such as in the hierarchical tree format of FIG. 6, which allows the user to view all of the sub-subcategories 232, categories 222, and requirements 250 related to the selected subcategory 230'. Accordingly, the user can quickly view the relationship between the requirements 250 and the various groupings (i.e., categories 222 and subcategories 230).
  • the assessment requirements 250 may be divided into fewer groupings, or no groupings at all, as desired.
  • the particular hierarchical arrangement chosen can depend on the number and type of requirements 250 that will be utilized with the automated assessment assistance system described herein. As noted above, the uses of this system can vary widely among companies, businesses, products, and services, and, accordingly, the assessment requirements 250 needed or desired can vary widely.
  • a first selection of a category 222, such as a single click, will display the related subcategories 230 and requirements 250 in window 240, while a second selection, such as a double click for example, provides this same display in window 240 but also expands the tree format to display all related subcategories in window 224, as shown in FIG. 5.
  • a first selection of a subcategory 230 will display all related sub-subcategories 232 and requirements 250 in window 240, while a second selection provides this same display in window 240 but also expands the tree format to display all related sub-subcategories in window 224, as shown in FIG. 5. Accordingly, the user can choose whether to "drill down" or expand the tree format of window 224.
  • Other input mechanisms could be provided for this purpose. For example, view levels could be provided to allow the user to expand or limit the detail provided regarding the hierarchical arrangement of the categories 222 and related subcategories 230/232 shown in the window 224. Buttons, menus, icons, voice commands, and the like can be utilized to allow the user to select the desired detail level. In the window 240 of FIG. 5 or FIG. 6, the user may select any of the requirements 250.
  • Upon such a selection, the window 240 will expand to display the full description of the selected requirement 250', as well as the related subcategory 230' and sub-subcategories 232' of that requirement. In this way, an auditor or other user can verify that the proper or desired requirement is being viewed.
  • requirement rating/criticality buttons 252 are also displayed to allow the user to rate the business party's performance with respect to that requirement.
  • buttons 252 are labeled "pending" for when the requirement 250' is still pending a rating by the user (this can be used as the default button), "N/A" for when the requirement is not applicable to the party, "excellent" for excellent performance with respect to the requirement, "acceptable" for acceptable performance with respect to the requirement, "minor" for minor problems with the performance expected (as described by the requirement description), "major" for major problems with the performance expected, and "critical" for critical problems with the performance expected.
  • a text input box 254 can be used to allow the user to input one or more text comments relating to the business party's performance with respect to the selected requirement 250', such as any issues, problems, observations/commendations, improvements relating thereto.
  • voice recognition technology is utilized in entering such comments, for efficiency in conducting the assessment.
  • if the assessment is conducted in a noisy area, notes can be transcribed using voice recognition after exiting the noisy area.
  • For requirements rated with a "major" or "critical" rating, text should be provided in text box 254 to clarify exactly what issues or problems were observed for the requirement. Accordingly, during an assessment, assessors can compare the business party's operations and facilities to the defined quality requirements, and, using the text input box 254, the assessor can note deviations from or satisfaction of the requirements via detailed descriptions.
  • the text input box 254 can be used to record issues or problems observed, as well as to explain any risks due to issues/problems.
  • the criticality buttons 252 can be used to identify the concern or significance of the issues observed.
  • Once the requirement selection, such as a click or touch of the requirement, is received, the full requirement description is displayed, as well as the rating buttons and input boxes related to that requirement, as shown at block 128.
  • the program then receives the rating which is made for that requirement, along with any comments that the user might have with respect to that requirement, as shown at block 130.
  • These ratings and comments can be stored in a database, along with an identifier of the requirement being rated.
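  • A minimal sketch of such storage follows (the table and column names are assumptions); each saved row carries the requirement identifier so that the rating and comment can later be matched back to the requirement description:

        # Hypothetical schema: one row per rated requirement, keyed by its identifier.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE requirement_rating (
                            requirement_id TEXT, rating TEXT, comment TEXT)""")

        def save_rating(requirement_id, rating, comment=""):
            conn.execute("INSERT INTO requirement_rating VALUES (?, ?, ?)",
                         (requirement_id, rating, comment))
            conn.commit()

        save_rating("2.1.1.2", "major",
                    "Lot labels missing on two pallets of incoming material.")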
  • the selected rating button 252' is the "major” button, indicating "major” issues with respect to the requirement 250'.
  • the computer can display a rating icon 256 corresponding to the selected rating 252'.
  • a dot of a particular color is utilized to indicate the rating with respect to that requirement 250'.
  • other images, icons, text, numerals, letters, indicia, and the like could be utilized and displayed next to a requirement 250' to indicate the current rating given by the auditor to that requirement.
  • the text 257 is input by the auditor using an input device, which may comprise a voice recognition device for maximum convenience, to indicate any comments that the auditor may have regarding the business party's performance with respect to the requirement 250'. For example, it may be desirable to provide additional detail regarding how the party performed with respect to the requirement 250', how the party could improve with respect to the requirement, or to otherwise take notes or memorialize observations with respect to the requirement.
  • While the exemplary embodiment utilizes buttons 252 and text box 254 to provide the rating, other buttons, boxes, windows, icons, lists, selectable items, and the like could be utilized to assist the auditor in rating a requirement 250.
  • Such an automated process does not require the auditor to remember the categories 222, sub-categories 230-232, and requirements 250, and can automatically select the pertinent requirements based upon the type of assessment desired, and other input provided in the setup screen.
  • the auditor can select another subcategory 230 or sub-subcategory 232 in the window 224 of FIG. 8. For example, the auditor could select the current subcategory 230'. Such a selection will return the user to the viewing screen 233, as shown in FIG. 9. It should be noted that this is the same view as in FIG. 6, except that the rating icons 256 show the ratings which the user provided for the requirements 250. Another difference is that the progress percentages 258, 260, 262, and 264 (which are displayed next to the assessment type 208, the categories 222, the subcategories 230, and the sub-subcategories 232, respectively) have been updated to reflect the requirements 250 which have been rated.
  • the computer calculates the percentage of the total requirements which have been rated and displays this as a total progress percentage 258. Likewise, the computer calculates the percentage of the total requirements of a category 222 which have been rated and displays this as a category progress percentage 260. Similarly, the percentage of the subcategory requirements which have been rated is displayed as a subcategory progress percentage 262, and the percentage of sub-subcategory requirements which have been rated is displayed as a sub-subcategory progress percentage 264. Accordingly, in addition to providing the hierarchical arrangement of the various requirements, the window 224 also provides the user with continual updates of his or her progress.
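  • For example, a sketch of the underlying calculation (data layout assumed) simply divides the number of rated requirements by the number of requirements in the relevant grouping:

        def progress(ratings):
            """ratings: mapping of requirement id -> rating, 'pending' meaning unrated."""
            rated = sum(1 for r in ratings.values() if r != "pending")
            return round(100 * rated / len(ratings)) if ratings else 0

        all_ratings = {"1.1.1": "acceptable", "1.1.2": "pending", "1.2.1": "minor"}
        subcategory_1_1 = {k: v for k, v in all_ratings.items() if k.startswith("1.1")}
        print(progress(all_ratings), progress(subcategory_1_1))   # e.g. 67 50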
  • a suggested rating is calculated and displayed for each category 222. This suggested rating is shown with reference numeral 266 in FIG. 9, and is based upon the ratings given by the user for the various requirements within that category.
  • a formula, algorithm, or calculation, such as an averaging calculation for example, can be utilized to provide these suggested category ratings 266.
  • the following algorithm/process is utilized: - When a category has 1 or more "Excellent” ratings and no “Minor", “Major,” or “Critical” ratings for the requirements within it, it receives a 10.
  • such an algorithm can be programmed to allow for the automatic calculation of a suggested category rating.
  • the algorithm could be programmed as one or more condition statements, such as "IF-THEN" statements, to automatically determine a suggested category rating based upon the ratings given to the requirements within the category.
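  • Only the first rule of the algorithm is quoted above; the sketch below (in Python, standing in for the condition statements of the assessment program itself) implements that stated rule and marks where further rules would be added, with the remaining branches left as assumptions of a particular implementation:

        def suggested_category_rating(requirement_ratings, default=8):
            """Suggest a category rating from the ratings given to its requirements."""
            counts = {r: requirement_ratings.count(r) for r in set(requirement_ratings)}
            has = lambda *names: any(counts.get(n, 0) for n in names)
            # Stated rule: one or more "excellent" and no "minor"/"major"/"critical" -> 10.
            if has("excellent") and not has("minor", "major", "critical"):
                return 10
            # ... further IF-THEN rules for mixes of minor/major/critical would go here ...
            return default   # fall back to a default suggestion (8 in the example of FIG. 10)

        print(suggested_category_rating(["excellent", "acceptable", "acceptable"]))  # -> 10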
  • the user may override the suggested rating, although override capability need not be permitted if it is desirable to force the user to utilize the automatically determined category rating.
  • blocks 132 and 134 illustrate these progress update and rating suggestion steps of this exemplary method.
  • the ratings for rated requirements are displayed, such as, for example, next to the requirement description or other identification code for the requirement.
  • progress percentages are displayed at block 132, to inform the user as to how much of the assessment has been completed and how much still remains to be completed.
  • the suggested category rating is calculated from all of the rated requirements within the category, and this suggested rating is displayed for use by the auditor.
  • the suggested ratings 266 of FIG. 9 can be utilized by the auditor in providing overall ratings for each category 222.
  • the auditor can select a category 222, such as by clicking on or otherwise pointing to the desired category.
  • the screen 229 is again displayed, as shown in FIG. 10.
  • the suggested rating 260' for the selected category 222' has changed from the default (in this case 8) to a value based upon the recently entered ratings for the requirements 250 falling within that category 222'.
  • the user may utilize this suggested rating, or, alternatively, the user may provide another rating using the category rating buttons 234.
  • these buttons 234 allow the assessor to assign a quantitative rating on a scale of 1-10 which corresponds with the business party's degree of implementation and effectiveness relative to the principled intent of the category 222.
  • a rating of 10 could be used when excellent implementation and excellent effectiveness of the category standard/description 242 have been observed, a rating of 8 could be used when full implementation and full effectiveness of the category standard 242 have been observed, a rating of 6 could be assigned when satisfactory implementation and partial effectiveness of the category standard 242 have been observed, a rating of 4 could be assigned when partial implementation and partial effectiveness of the category standard 242 have been observed, a rating of 2 could be used when some attempt at implementation but no effectiveness of the category standard 242 have been observed, and a rating of 0 could be appropriate when no implementation and no effectiveness of the category standard 242 have been observed.
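  • Restating that guidance in code form (purely illustrative; the tuple keys are paraphrases of the descriptions above), the scale can be viewed as a lookup from observed implementation and effectiveness to the numerical rating:

        # (implementation, effectiveness) -> category rating on the 0-10 scale.
        CATEGORY_SCALE = {
            ("excellent", "excellent"): 10,
            ("full", "full"): 8,
            ("satisfactory", "partial"): 6,
            ("partial", "partial"): 4,
            ("some attempt", "none"): 2,
            ("none", "none"): 0,
        }

        print(CATEGORY_SCALE[("satisfactory", "partial")])   # -> 6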
  • the category rating button 234 should be assigned after the requirements within the category 222 have been adequately assessed.
  • the user may provide comments in text box 236 to describe or summarize the overall performance of the business party with respect to the selected category 222'. These category ratings and comments can then be saved in a database, with an indication of the category to which they apply.
  • at step 136, the user decides whether to enter more ratings for requirements within a category. If more ratings are to be entered for the requirements within the category, the process continues to block 126, and the user can select and rate the desired requirements, as described above. However, if no more ratings are to be entered, the user can select the category in order to provide an overall category rating.
  • a category selection is provided from the user, and, in response to this selection, category ratings buttons and a category comment box are displayed. Accordingly, the user can input a rating for the category as well as comments regarding the performance observed for the category. Once received, these inputs are saved, as shown at block 142.
  • a status screen 280 can be displayed when the user makes a particular selection, such as from a menu, button, or icon.
  • the status screen 280 can illustrate how far along the assessment is, as well as give a summary of the ratings given so far.
  • the screen 280 can list the number of requirements rated so far, out of the total requirements to be rated, as shown by reference numeral 282.
  • the categories rated so far out of the total categories to be rated can also be displayed, as shown by reference numeral 284.
  • Other progress numbers can also be displayed, such as the percentage 286 of requirements rated, as well as the number 288 of requirements rated.
  • a running total of the ratings can also be displayed, such as the number of not applicable ratings, the number of excellent ratings, the number of acceptable ratings, the number of minor ratings, the number of major ratings, and the number of critical ratings, as shown by reference numerals 290-295.
  • a capability rating 298 can also be provided based upon the requirements rated so far, in order to provide a suggested overall score for the entire assessment.
  • the score can then be displayed along with the type/depth of assessment conducted (e.g., standard vs. advanced).
  • the exemplary embodiment of the present invention can also automatically assist in the preparation of assessment reports.
  • the assessing computer 322 can transfer and share the collected assessment data with one or more other computers 324 and 326, and/or with one or more servers 320.
  • the connection 321 between these devices can be a network connection, such as via Internet, Intranet, and/or Ethernet, or can be a wired or wireless peer-to-peer connection, such as via modem or other communication device.
  • the assessing computer 322 includes a processor 334 which executes assessment software in memory 336.
  • the assessment software can operate as described above with respect to FIGS. 1-11.
  • the assessment input data saved using such software can be automatically placed into a report format and uploaded to server 320, which utilizes a processor 330 to store the data in central database 332.
  • the connection 321 can allow access to this data by other computing devices 324, 326, such as through an Internet or Intranet connection. Accordingly, assessment data recorded by one computer 322 can be quickly and easily accessed, analyzed, summarized, and viewed by other computers.
  • FIG. 13 provides an example of browser software which may be utilized by the computers 322, 324, and 326 to access the assessment data stored in the central database 332.
  • the browser software is the NAVIGATOR browser made by Netscape, although other browser or viewing software could be utilized, such as the EXPLORER browser made by Microsoft for instance.
  • the browser window 340 provided by such software can list and allow access to the assessment input data and/or reports prepared by various auditors with respect to various business parties.
  • each assessment report is listed by business party (e.g., supplier) name 344, city 346, country 348, date of the assessment 349, and assessment objective or scope 350. The user can select one or more of these fields to then access the actual assessment report displaying and/or summarizing the assessment data collected.
  • a separate or additional icon or link 352 can be provided to allow selectable access to the report.
  • By maintaining the central database 332 of report data, quick and efficient access to assessment reports is provided, even from remote locations.
  • the central database or repository 332 of assessment reports and/or data can be utilized to easily determine when to assess suppliers, which supplier sites to assess, to what depth (e.g., standard vs. advanced) to assess suppliers, and how often to re-assess suppliers. This can be determined, automatically or manually, based upon the length of time since the last assessment, the ratings given in previous audits, the number of critical requirements observed, the number of audits conducted on the party, the importance of the party to the company's products/services, the experience of the party with the company, recent events which might indicate a need for an interim assessment, etc.
  • Appropriate security measures can be taken to restrict read and/or write access to the assessment data in the central database 332, and/or to otherwise control the information which can be accessed by a remote user.
  • the assessment collection software described above can also allow for exporting of the data collected to a compilation database, separate from or integral with database 332, which allows for all data from all assessments to be searched, summarized, and otherwise analyzed simultaneously. This can be achieved, for example, by adding assessment data to the compilation database upon completion of an assessment, or by concatenating data from all assessment reports generated by the auditors.
  • Such a compilation database can be used for trend analysis to identify requirements which a party or multiple parties are frequently having problems fulfilling, and/or to quickly locate the most problematic/critical requirements.
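  • A hypothetical sketch of such trend analysis (row layout assumed) counts, across all assessments in the compilation database, how often each requirement drew a "major" or "critical" rating, surfacing the most problematic requirements first:

        from collections import Counter

        compiled = [  # (party, requirement_id, rating) rows gathered from many assessments
            ("Acme", "1.1.1", "major"), ("Acme", "2.1.1.2", "critical"),
            ("Beta", "1.1.1", "critical"), ("Beta", "1.2.1", "acceptable"),
        ]

        problem_counts = Counter(req for _, req, rating in compiled
                                 if rating in ("major", "critical"))
        for requirement_id, count in problem_counts.most_common():
            print(requirement_id, count)   # the most frequently problematic requirement prints first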
  • FIGS. 14-16 illustrate exemplary reports 358 that can be automatically created by the assessment software described herein, or by separate software, from the assessment input provided by the user, such as described above.
  • the assessed parties name 360 and location 362, the auditor's name 364, the assessment scope 366, and an internal assessment name or code 368 are displayed.
  • Assessment data 369 is also displayed, in table format in this exemplary report.
  • the assessment date 370 is displayed as well as the days elapsed 372 since the assessment.
  • the report 358 displays the total number 374 of records in the database, the number 376 of requirements assessed, the percentage 378 of the total applicable requirements that have been assessed, as well as the total requirements 380 which remain unassessed (e.g., which are "pending").
  • Data regarding the ratings given to the assessed requirements can also be displayed, such as the total number 382 of requirements rated "inapplicable", the total number 384 of requirements rated "excellent", the total number 386 of requirements rated "acceptable", the total number 388 of requirements rated "minor", the total number 390 of requirements rated "major", and the total number 392 of requirements rated "critical".
  • These requirement ratings can be recorded using the requirement rating buttons 252 described above, or a similar input mechanism and user interface.
  • Such data can be displayed with graphs or charts, if desired.
  • pie chart 394 is displayed in report 358, with a piece 391 indicating "minor" ratings, a piece 393 indicating "major" ratings, a piece 395 indicating "critical" ratings, a piece 396 indicating "pending" ratings, a piece 397 indicating "critical" ratings, a piece 398 indicating "excellent" ratings, and a piece 399 indicating "acceptable" ratings.
  • Other methods or views for display of data could be utilized in addition to or as alternatives to those of exemplary report 358.
  • the report 358 is in table format and shows the ratings which were given for each of the categories 222.
  • identification data including the name 360 and location 362 of the assessed party, the name 364 of the auditor who assessed the party, the date 370 of the assessment, and an identification code 371 for the assessed party.
  • the categories 222 (i.e., category descriptions) are listed in the report, along with the rating 235 given to each category.
  • These ratings 235 can be entered and recorded using the category rating buttons 234 described above, or a similar user interface input device.
  • one or more comments 237 are displayed for each category 222. These comments 237 can be entered using the category comment entry box 236 described above, or similar user input mechanism.
  • an overall rating is provided for the assessment.
  • the overall rating is the capability rating 298 which is automatically calculated by the computer, such as via the equation 299.
  • FIGS. 16a and 16b illustrate another assessment report format that may be utilized.
  • Identification data 360, 362, 364, and 370 are displayed at the top of the report 358.
  • the body of the report includes a table 359 which lists each of the relevant categories 222, as well as the rating 235 applied to each category. In this example, columns are provided for each possible rating, and X's or other indicators are placed in the appropriate column to indicate the rating given for the category 222.
  • the comment 237 saved for each category is also displayed.
  • the ratings 235 can be assigned using the rating buttons 234 described above, and the comments 237 are assigned using the input boxes 236 described above.
  • the reports 358 described above can be automatically generated by a report generation program, which may be part of the assessment entry software described above with respect to FIGS. 1-11 (such as program 336 of FIG. 12), or which may be provided as a separate program (such as program 332 illustrated in FIG. 12.)
  • the reports 358 may be generated automatically by accessing the assessment inputs stored during the conducting of the assessment. Rather than saving the category descriptions 222 or requirement descriptions 250 with this data, identifier codes can be saved. Then, when the report is generated, the codes can be matched up with the corresponding descriptions for display in the report 358, if desired. For each identifier, the rating given by the auditor is also stored, as well as the auditor's comment(s). Accordingly, the reports 358 can be easily and automatically generated once the assessment ratings have been collected.
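  • As a rough sketch of that lookup (the descriptions and rows below are invented for illustration), report generation amounts to joining each stored identifier code with its requirement description and formatting the rating and comment alongside it:

        # Hypothetical description lookup keyed by requirement identifier code.
        descriptions = {"1.1.1": "Owners identified and records maintained",
                        "2.1.1.2": "Incoming materials verified against specifications"}
        stored = [("1.1.1", "acceptable", ""),
                  ("2.1.1.2", "major", "Two lots received without certificates of analysis.")]

        def build_report(rows):
            lines = []
            for code, rating, comment in rows:
                lines.append(f"{descriptions.get(code, code)}: {rating.upper()}"
                             + (f" - {comment}" if comment else ""))
            return "\n".join(lines)

        print(build_report(stored))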
  • FIG. 19 illustrates an additional exemplary report that may be automatically generated using an assessment entry program made in accordance with principles of the present invention.
  • the report includes an assessment summary portion 401 which provides a summary of the assessment conducted.
  • various identification information can be presented, such as the business party's identification information 402, the business party's contact information 404, and the assessor identification information 426. Some or all of this information can be collected using the assessment collection software, such as by using the setup information screen of FIG. 3, or the like.
  • Additional information can also be collected regarding the assessed party, such as, for example, a description 406 of the assessment purpose, a description 408 of the facilities, operation, or warehouse assessed, a description 410 of the business party assessed, and a description 412 of other external audits, certification, or associations of the business party.
  • This information can likewise be entered using a screen similar to FIG. 3, and automatically saved with the assessment inputs.
  • the assessor can be requested to provide this information upon generation of the report.
  • Summary information regarding the assessment can also be provided in the summary section 401.
  • a summary 414 of the assessment findings can be provided, as well as a summary 424 of the actions which will be taken by the business party in response to the assessment.
  • Such summary data can be collected by the assessment entry program upon the completion of the assessment, or can be requested upon generation of the report.
  • the assessment data taken during the assessment can also be summarized and displayed.
  • the "Capability" scores 416 for the current and previous assessment can be displayed, as can the number 418 of requirements rated "critical", the number 420 of requirements rated major, and the number 422 of requirements rated "minor".
  • Such summary data can be automatically calculated from the assessment data collected as described above.
  • the exemplary report includes a category rating section 403, as shown in the example of FIG. 19b.
  • the identification information 402 is repeated at the top of the section 403, and the capability score 417 for the assessment is displayed, along with the calculation 428 used to generate the score.
  • In table 429, the pertinent categories 222 which were assessed are listed, along with the ratings 235 given during the assessment for those categories, as well as any summary comments 237 provided for each category.
  • a rating key 430 can be displayed to indicate the meaning of the various ratings on the scale.
  • the exemplary report includes a requirement rating section 405 which lists or summarizes the requirements which were not satisfied and/or for which issues or problems were observed.
  • the identification information 402 is repeated at the top of the section 405.
  • the categories 222 which had problematic requirements are displayed. Under each category 222 is displayed the standard or principle 242 of that category, as well as a system description 432, summary comments 434, and general observations 441 relating to the corresponding category. This information can be collected using text entry boxes, such as the box 236 of FIG. 10 or the like, during or after the assessment.
  • the specific observations 257 collected during the assessment (such as by using the screen of FIG. 8 with the box 254) regarding the problematic requirements are displayed.
  • the user can edit the setup information regarding the assessment during the assessment, to further include or exclude requirements during the assessment.
  • a screen can be automatically displayed upon the selection of an icon or menu, such as the "tools" menu 451 of FIG. 9.
  • the screen 450 allows the user to add or modify the various identification information 202, 204, 206, and 208.
  • the setup information utilized to automatically select the pertinent requirements can be modified as well. For example, the user can change the assessment type/focus using list 210, the assessment objectives using the list 212, and the external standards or certifications using the list 214. Once modified, the information can be saved and the pertinent requirements updated by selecting the "save" button 217.
  • FIG. 21 illustrates a ratings guide 460 that can be displayed during the assessment, such as by using an icon or menu, such as a "help" menu.
  • the ratings guide 460 can assist the assessor in assigning one of the ratings 235 for the category 222 which is currently selected.
  • each subcategory 230 related to the selected category 222 can be displayed.
  • Under each subcategory is displayed an explanation 462 of when the corresponding rating 235 in the table is applicable to that subcategory (i.e., what performance merits that rating).
  • the assessor can better determine what overall numerical rating or grade 235 to assign to a particular category.
  • FIG. 18 is a process diagram illustrating the consolidation of requirements into a central database, the selection of relevant requirements from that database, the rating of the relevant requirements, and the generation of reports therefrom.
  • a number of different requirements can be utilized for various assessment situations 270. For example, audits focused on contamination issues would use a particular set 270 of requirements.
  • These requirements can be combined into a single requirement database 272.
  • Each requirement 273 can be saved in the database 272 along with an identifier (e.g., code, number, bit, etc.) 274 of the categories and sub-categories to which it is related. Also, each requirement 273 can be saved with an identifier 276 of the situations in which it is to be used.
  • a certain requirement 273 may be useful when contamination is a focus or objective of the assessment. Moreover, that requirement may be detailed and therefore applicable for advanced type audits, as opposed to standard, or less detailed type audits.
  • the central storage of the requirements in database 272 also allows for ease of modifications of requirements and/or categories as desired and needed.
  • Upon initiating an assessment, and/or during the assessment, the user provides setup inputs regarding the type, objectives, scope, focus, and/or depth of the assessment to be conducted. Based upon these inputs, a subset 278 of the relevant requirements is automatically selected and/or saved, and then displayed to the user by selecting these requirements having identifiers 276 corresponding to the setup input. The user then enters ratings and/or observations with respect to the various requirements. In addition, ratings and observations can be made with respect to the various categories of the various requirements in the subset 278, as well as an overall rating for the assessment. The ratings and observations can be stored, along with the requirements to which they relate.
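A minimal sketch of this selection step follows, assuming a simple record layout in which each requirement carries situation identifiers 276 that can be matched against the setup inputs; the identifiers and setup values shown are hypothetical.

```python
# Central requirement database 272; each record carries situation identifiers 276.
requirement_db = [
    {"id": "1.1.1", "category": "starting-materials", "situations": {"standard", "iso"}},
    {"id": "2.1.1", "category": "packing",            "situations": {"advanced", "contamination"}},
    {"id": "3.4.2", "category": "housekeeping",       "situations": {"standard", "contamination"}},
]

def select_subset(db, setup_inputs):
    """Keep requirements whose situation identifiers overlap the setup inputs."""
    return [r for r in db if r["situations"] & setup_inputs]

# Setup inputs for an audit focused on contamination issues.
subset = select_subset(requirement_db, {"contamination"})
print([r["id"] for r in subset])  # ['2.1.1', '3.4.2']
```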
  • a report 281 can be automatically generated showing the categories and requirements and the ratings/observations regarding each, such as by matching identifiers for requirements and categories with descriptions relating thereto.
  • the data can be automatically summarized, sorted, analyzed, and formatted as desired in the report 281.
  • Each report 281 from each assessment can then be stored in a central report repository 285, such as in electronic format, which can be easily accessed by users, such as by using electronic devices in various remote locations.
  • the data can be exported to a central data repository 283 where it is combined with data from other audits.
  • This data repository 283 can be accessed by users for electronically sorting, searching, summarizing and analyzing the data of all assessments, as well as identifying critical issues and trends.
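The following sketch, using made-up records, illustrates the kind of cross-assessment summarizing such a combined repository could support, such as counting which category most often receives a "critical" rating; the record fields are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical records exported from many assessments into repository 283.
all_assessments = [
    {"party": "Supplier A", "category": "Contamination", "rating": "critical"},
    {"party": "Supplier B", "category": "Contamination", "rating": "major"},
    {"party": "Supplier B", "category": "Housekeeping",  "rating": "critical"},
    {"party": "Supplier C", "category": "Contamination", "rating": "critical"},
]

# Which category is most often rated "critical" across all audits?
critical_by_category = Counter(
    rec["category"] for rec in all_assessments if rec["rating"] == "critical"
)
print(critical_by_category.most_common(1))  # [('Contamination', 2)]
```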
  • when a subsequent assessment of a previously assessed party is initiated, data from the previous report 281 of that party can be automatically inserted into the data entry fields of that subsequent assessment entry file 207 as default values.
  • the user can either select that previous assessment data from the report repository 285 and save it as a new assessment file, or the user can select a party to audit and, in response to that selection, the previous data for that party is automatically selected from the repository 285 or 283 and inserted into the data entry fields.
  • the text observations 257 can be automatically inserted and the rating buttons 252 can be automatically selected for the various requirements.
  • the user can then know the previous comments and ratings for the party with respect to the various requirements, but is free to change these as appropriate during the subsequent assessment.
  • Pick lists and criticality lists can also be provided and displayed during the execution of the subsequent assessment data entry program to identify areas which were considered problematic during the previous assessment, to ensure that the assessor addresses these areas during the subsequent assessment.
  • data from the previous assessment is provided in the subsequent assessment data entry screens 287.
  • default observations 289 from the previous assessment are provided in the data entry screen 287, and a list 291 of issues from the previous assessment is displayed. (For instance, a list of requirements rated "Critical" during the previous assessment could be displayed.)
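As a hedged sketch of this default-seeding behavior, the code below carries forward prior ratings and observations as editable defaults and collects previously "critical" requirements into an issues list; the codes and field names are illustrative assumptions, not the patent's data layout.

```python
# Hypothetical results stored from the previous assessment of the same party.
previous = {
    "REQ-3.1.1": {"rating": "critical", "observation": "No pest control log."},
    "REQ-3.1.2": {"rating": "acceptable", "observation": "Area clean and organized."},
}

def seed_new_assessment(previous_data):
    """Build default entries and an issues list (291) for the subsequent assessment."""
    entries = {}
    issues = []
    for code, data in previous_data.items():
        # Carry forward the old rating and text as editable defaults.
        entries[code] = {"default_rating": data["rating"],
                         "default_observation": data["observation"]}
        if data["rating"] == "critical":
            issues.append(code)
    return entries, issues

entries, issue_list = seed_new_assessment(previous)
print(issue_list)  # ['REQ-3.1.1']
```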
  • the embodiments disclosed above can provide efficiency and objectivity to the auditing process. Relevant requirements can be selected and displayed quickly and efficiently, so that the auditor utilizes the correct requirement descriptions.
  • ratings and comments with respect to the various requirements can be entered easily by the user (e.g., using a portable computer and voice recognition) to make the assessment process more efficient.
  • suggested category ratings and overall assessment ratings can be automatically generated as well, to assist the user in providing ratings for the various broad categories and for the assessment as a whole, and reports and statistical analysis can be automatically generated at the conclusion of the assessment.
  • a central repository of assessment reports provides wide access to assessments, and the data from all assessments can be combined in a common database to allow for automatic creation of summaries and reports and to identify trends.
  • the use of a range of possible ratings or criticality factors can allow the user to assess the requirement according to predicted impact on the company's business, thereby reducing the amount of subjectivity in the assessment.
  • a portable digital device could contact a central computer which includes the assessment input and display screens.
  • the relevant assessment input requirements can be downloaded from the central computer to the digital device, in parts or as a complete set. Inputs can be provided on the digital device to rate the assessment requirements and this information can be uploaded to the central computer and/or saved on the portable device.

Abstract

A method and system for conducting assessments of business parties and generating data therefrom. According to one embodiment, the user provides assessment characteristic inputs regarding assessment type, depth, and/or objectives to a computer program which, in response to the inputs, selects and displays pertinent requirements to be utilized during the assessment. Screens (32) are provided to display the requirements, and the categories and subcategories related thereto, in a hierarchical fashion. The description or standard for each requirement is displayed and input boxes and/or buttons (35) are provided to allow the user to enter ratings and/or comments or observations regarding each requirement. Such ratings and comments can also be provided for each overall category, and suggested ratings can be automatically calculated based upon the ratings given to the requirements within the categories. For convenience, a tablet PC or other handheld computer (30) can be utilized to execute the program, and voice recognition technology can be utilized to provide the comments. Once the assessment is completed, reports can be uploaded to a central repository which can be accessed by a plurality of users. Moreover, the data from the assessment can be added to a compilation database, and summarizing, sorting, and other analyzing of the compilation database can be automatically conducted in order to identify critical issues, trends, improvements, and the like.

Description

METHOD AND SYSTEM FOR FACILITATING ASSESSMENTS OF BUSINESS PARTIES
COPYRIGHTS PRESERVED
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
TECHNICAL FIELD
The present invention relates generally to a method and system for facilitating and standardizing assessments of business parties, and more specifically, in one embodiment, to the display of a hierarchical arrangement of assessment requirements and categories, and to the electronic entry, storage, analysis and reporting of ratings and comments regarding the requirements and categories.
BACKGROUND OF THE INVENTION In the course of business, a company typically establishes relationships with a number of business parties for supply of goods and/or services needed in carrying out the company business. For example, a company that manufactures a food product might receive various ingredients for making the food product from a variety of suppliers. Companies have routinely assessed or audited such business partners in order to ensure the quality of the ultimate products produced by the company. For instance, if a supplier's food ingredient is contaminated, this may result in contamination of the ultimate food product produced by the company. Accordingly, it is desirable to monitor and/or inspect these business parties as appropriate to ensure that the company's performance requirements (which can include expectations, standards, codes, principles, regulations, procedures, goals, practices, criteria, ideals, inspection items, and/or questions) are being met. These requirements can fall under various categories (e.g., areas or topics), and the categories which are assessed can vary depending on the business party and the depth and objectives of the assessment. Such requirements can include performance standards, goals, and/or expectations with respect to business practices and procedures, manufacturing operations, product quality, facilities and equipment, personnel and management, and other product, process, and service areas. Business parties, including suppliers, vendors, service providers, partners, contract manufacturers, and other associated entities with which a business may be involved, may be audited by the company, and the categories and requirements utilized during the audit may differ for each such party. Such assessments or audits can require extensive textual descriptions of the various requirements (e.g., standards or expectations) which are to be evaluated. This dependence on text can create problems in understanding and remembering the various requirements. Accordingly, some auditors (assessors) may utilize written descriptions of the assessed categories and requirements, to assist in remembering the categories to be assessed and the requirements under each category. However, transporting, organizing, searching, accessing, and referencing such written descriptions during an assessment can be inconvenient and time consuming. Moreover, while checklists and the like can be utilized to help memorialize the categories to be assessed, sufficient detail is typically not provided in such checklists to ensure that all pertinent requirements are met. In addition, capability is usually not provided to quickly view only pertinent requirements for the particular party being evaluated, and safeguards are not provided to ensure that all pertinent assessment categories and requirements are evaluated and that the auditor does not overlook some of these pertinent categories and requirements.
Because of the inconvenience of physically transporting voluminous paper descriptions of the assessed requirements, and/or the categorizes and sub-categories within which they fall, some auditors choose to attempt to remember all of the pertinent assessment requirements and categories. Clearly, such a system can result in certain assessment categories and requirements being left insufficiently evaluated or completely unevaluated. Also, incorrect requirements and standards can be utilized, sometimes resulting in improper or skewed assessment ratings. Moreover, without clear requirements and standard rating systems, such conventional assessment methods can result in rating reports which are subjective and/or difficult to interpret. In particular, a particular auditor may have slightly different views regarding a particular requirement and may be influenced by subjective factors. Also, one auditor may utilize a slightly different rating scale or evaluation system when compared to another auditor, or when compared to previous assessments made by the same auditor. The lack of detail provided in any written descriptions which may be utilized for the various categories and requirements allows for frequent inconsistency as well as widespread subjectivity from assessment to assessment. Accordingly, it can be very difficult to compare assessments of various business parties, and it can also be difficult to compare multiple assessments of the same business party. Consequently, it can be difficult to determine how one party compares to another, whether a party's goods or services should continue to be utilized, and whether a party is making improvements in any of the assessed categories. The comments and ratings provided by auditors in conventional assessments are usually very text intensive as well. Accordingly, during such assessments, the auditor usually takes notes on paper or dictates ratings and comments into a conventional dictaphone. Once all the categories and requirements have been evaluated, the auditor then utilizes the notes and recordings to prepare a report regarding the business party. However, preparation of such a report can be time consuming and inefficient, because organizing and properly presenting such notes can be complex. Moreover, in order to make the report understandable, the auditor usually must spend time to reiterate each assessment category and the requirements relating to that category before providing the evaluation for that category. As can be understood, such reiteration can be quite time consuming and inefficient, especially when there are a number of assessment categories and requirements. Finally, the report format, content, and style utilized by various auditors can vary, compounding the difficulty in quickly comparing and reviewing multiple assessments.
Assigning ratings for each category which is assessed can also be subjective. While the auditor may identify many deviations regarding the various requirements within the category, the business importance of each specific requirement and its deviations may be difficult to determine upon completion of the assessment.
Moreover, each assessment will typically result in a separate assessment report which is separately filed. In such a system, it can be difficult to aggregate, summarize, and/or analyze information from various audits of the same or differing parties.
Consequently, it can be difficult to easily identify trends or areas where improvement is needed or where improvement has been made.
Accordingly, a method and system for efficiently evaluating business parties is needed. In particular, it is desirable to have such a method and system which ensures that the auditor will evaluate all relevant assessment categories and that clear and standardized requirements are utilized in evaluating these categories. Such a method and system is also needed which allows the auditor to quickly and efficiently enter evaluation ratings and comments, and which assists the auditor in quickly generating standardized reports from such ratings and comments. Moreover, a method and system is needed for easily accessing, summarizing, and analyzing various assessment reports which are generated.
SUMMARY OF THE INVENTION
It is an object of at least one embodiment of the present invention to obviate the above-described problems. It is a further object of at least one embodiment of the present invention to increase efficiency in assessing and rating performance and/or practices of business parties.
Yet another object of at least one embodiment of the present invention is to provide a standardized system of assessing business parties. Another object of at least one embodiment of the present invention is to help ensure that all relevant requirements are evaluated in an assessment of a business party.
It is another object of at least one embodiment of the present invention to increase objectivity in assessments of business parties and to standardize ratings scales and assessment procedures. Yet another object of at least one embodiment of the present invention is to increase efficiency in preparation of reports regarding the assessment of business parties. It is a further object of at least one embodiment of the present invention to provide assessment reports which can be easily reviewed and compared.
Another object of at least one embodiment of the present invention is to enhance the ability to analyze and evaluate assessment reports regarding business parties. A further object of at least one embodiment of the present invention is to provide easy access to assessment reports regarding business parties.
Yet another object of at least one embodiment of the present invention is to create a common database of assessment comments and ratings for use in identifying common trends and/or areas for improvement. The above objects are provided merely as examples, and are not limiting nor do they define the present invention or necessarily apply to every possible embodiment thereof. Additional objects, advantages and other novel features of the invention will be set forth in part in the description that follows and will also become apparent to those skilled in the art upon consideration of the teachings of the invention. To achieve the foregoing and other objectives, a method is provided for obtaining the evaluation of the performance of a business party. The method comprises displaying a requirement corresponding to a business performance expectation, and displaying possible ratings to be selected for the requirement. The method also comprises receiving a rating input from a user indicating a selection of one of the possible ratings and saving the selected rating along with an identifier of the requirement to which it applies.
Still other objects of the present invention will become apparent to those skilled in this art from the following description wherein there is shown and described exemplary embodiments of this invention, including a best mode currently contemplated for carrying out the invention, simply for the purposes of illustration. As will be realized, the invention is capable of other different aspects and embodiments without departing from the scope of the invention. Accordingly, the drawings and descriptions are illustrative in nature and not restrictive in nature.
BRIEF DESCRIPTION OF THE DRAWINGS
While the specification concludes with claims particularly pointing out and distinctly claiming the invention, it is believed that the same will be better understood from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a perspective view of a handheld tablet PC and related devices, which can run software programs for use in assessing business parties, according to principles of the present invention;
FIG. 2 is a flow diagram illustrating an exemplary automated method of assessing business parties, according to principles of the present invention; FIG. 3 illustrates an exemplary assessment setup screen, which can be utilized to receive setup inputs, such as assessment type, focus, objectives and the like, according to principles of the present invention;
FIG. 4 illustrates an exemplary assessment category screen, which can be utilized to view pertinent assessment categories, according to principles of the present invention; FIG. 5 illustrates an exemplary hierarchical category screen, which can be utilized to view category standards and related subcategories and requirements, such as by selecting one of the categories of FIG. 4, according to principles of the present invention;
FIG. 6 illustrates an exemplary assessment subcategory screen, which can be utilized to view subcategories and related requirements, such as by selecting one of the subcategories of FIG. 5, according to principles of the present invention;
FIG. 7 illustrates an exemplary requirement rating screen, which can be utilized to enter comments and ratings regarding various assessment requirements, and which can be viewed by selecting one of the requirements of FIG. 6, according to principles of the present invention; FIG. 8 illustrates an exemplary requirement rating screen showing exemplary inputs of comments and ratings, according to principles of the present invention;
FIG. 9 illustrates an exemplary requirement viewing screen automatically showing the ratings given to various assessment requirements, according to principles of the present invention; FIG. 10 illustrates an exemplary assessment category rating screen, which can be utilized for entering observations and ratings regarding various assessment categories, according to principles of the present invention;
FIG. 11 illustrates an exemplary assessment status screen, which can be utilized to automatically view the progress of the assessment and to automatically summarize ratings assigned, according to principles of the present invention;
FIG. 12 is a schematic diagram of a computer network, which can be utilized to access various assessment reports from remote locations, according to principles of the present invention; FIG. 13 illustrates an exemplary assessment report access screen, which can be utilized to view and select assessment reports to be viewed, according to principles of the present invention;
FIGS. 14, 15, 16a, 16b, 19a, 19b, and 19c, illustrate various exemplary assessment reports that might be generated using the assessment data collected and stored, according to principles of the present invention;
FIG. 17 is a data table diagram illustrating an exemplary data structure for storing assessment requirements, related categories, subcategories, assessment types, objectives, and other data, according to principles of the present invention;
FIG. 18 is a schematic diagram illustrating an exemplary process of organizing all performance requirements into a central database, selecting pertinent requirements based upon user inputs, storing observations and ratings for the pertinent requirements, and allowing access to reports and data generated therefrom, according to principles of the present invention;
FIG. 20 illustrates an exemplary assessment properties screen, which allows the user to change the setup parameters and thereby change the pertinent requirements utilized during the assessment, according to principles of the present invention; and
FIG. 21 illustrates an exemplary rating guide which can be utilized by the assessor in determining a rating for an assessment category, according to principles of the present invention.
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS
Turning now to the drawings in detail, wherein like numbers illustrate corresponding structure, FIG. 1 is an exemplary arrangement comprising a perspective view of a portable computer and related devices, which can run software and/or firmware programs which operate according to principles of the present invention. In this embodiment, a portable computer 30 is utilized for viewing evaluation requirements and for providing evaluation inputs, such as ratings (e.g., scores, grades, rankings, etc.) or comments (e.g., observations, notes, opinions), regarding each requirement. In particular, the portable computer 30 includes a display 32 for viewing and inputting information regarding these requirements. The display 32 could comprise a monitor, a liquid crystal display (LCD), a touchscreen display (which, in addition to displaying information, also allows the user to enter inputs by touching the display with a finger or with a special pen or stylus 33), and/or any other viewing screen or interface. Any of a variety of devices may also be utilized for providing the evaluation inputs (such as text, ratings, descriptions, observations, graphics, photos, video and the like) regarding the party's performance during the assessment and/or analysis of the assessment data. For instance, a keyboard 36 and/or mouse may be utilized to communicate with the computer 30, via wired or wireless channels, to provide inputs to the computer 30. Also, as noted, the display 32 may comprise a touchscreen display for entry of inputs via a user's finger, a stylus or pen 33, or other pointing device. A keypad 35, onscreen or hardware related, may also be provided to assist in entering data from the user.
As another option, a microphone 39 can be provided for use in conjunction with voice recognition software for providing inputs to the computer 30. An example of suitable speech-to-text software that may be utilized is NATURALLY SPEAKING by Dragon Systems, Inc. The microphone 39 may be part of a headset 38 which allows the microphone to be easily worn on the user's head during use, thereby freeing the user's hands. Alternatively, the microphone 39 can clip to the user's body or clothing, may have a separate stand, or may be integral with the computer 30. The microphone 39 can communicate with the computer 30 in any suitable manner, such as by plugging into a port 34 on the computer. Sound and video recording devices can also be utilized to record contemporaneous audio and video relating to the party's business performance with respect to various requirements. The microphone 39 and/or other microphone or sound input devices can be utilized to record the relevant audio. To provide video recordings to memorialize the business party's performance, a camera 37 can be utilized, such as a digital camera or still video camera for instance. Such a camera 37 can provide pictures to the computer 30 using any appropriate communication protocol, such as serial communication protocols or wireless protocols for example.
The portable computer 30 may comprise a general purpose or special purpose computer including a programmable controller or processor 31, such as a processor commercially available from Intel Corp. or Advanced Micro Devices, Inc. for example, and, a non- volatile memory device 33 for storing programs having instructions, such as on a hard disk drive, a CD ROM drive, a floppy disk drive, a flash memory unit, and/or a ROM chip. The computer 30 may also include volatile memory for use in executing programmed instructions, such as RAM or DRAM chips for instance, as well as appropriate circuitry for interfacing with the volatile and nonvolatile memory. Communication ports 34 and related circuitry may also be provided to interface the computer 30 with various devices, including input/output devices, such as keyboards, microphones, and printers for instance, as well as other computer devices via wired or wireless network or modem communications.
The portable computer 30 can comprise a computer provided with its own power source such as an onboard battery to allow the unit to be comfortably carried by a user (e.g., an auditor) for extended periods of time, such as while walking through the facility of a business party and conducting an assessment of the party's procedures, operations, organization, equipment, and or facilities. In one embodiment, the computer 30 comprises a tablet computer or pen computer, such as, for instance, the STYLISTIC or PENCENTRA models offered by Fujitsu Personal Systems, Inc. Such computers offer processing power, light weight, and long battery life. However, it should be understood that any of a variety of computers could be utilized with the methods and programs of the present invention, and the computers described herein may include laptop computers, notebook computers, desktop computers, portable data collectors, input stations, personal digital assistants (PDA's), portable electronic devices, Internet appliances, or other data input and/or display devices. The computer 30 may allow the user to easily view assessment requirements, enter ratings and comments, and prepare assessment reports, and, as will be understood, can take any of a virtually unlimited number of alternative forms. Standard operating system software can be utilized with the computer 30, if desired, such as WINDOWS operating system software for instance. An assessment assistant software or firmware program, described in further detail below, can be stored in memory 33 and run on the computer 30 to allow for inputting and reporting assessments. FIG. 2 is a flow diagram illustrating an exemplary series of steps according to which such software or firmware associated with the computer may operate. In addition, FIGS. 3-11 illustrate user interfaces, such as computer screen images for instance, which can utilize windows, frames, pages, icons, toolbars, menus, and/or text, in order to display and organize assessment requirements for the user, to receive assessment input from the user, and to automatically prepare assessment reports from the inputs. More specifically, as shown in FIG. 2, the user can begin an assessment by executing the stored assessment software program, as shown at block 110 of FIG. 2. This action can consist of selecting the appropriate program from a list or from a group of icons using an input device, or otherwise providing a predetermined input to begin execution of the program. The program may be written in any of a variety of languages suitable for creating software or firmware programs. For instance, languages such as C++, HTML, and/or Visual Basic may be utilized.
When the program begins running, the user may make a selection to create a new assessment file or to edit an existing assessment file, such as by using the input device to select a button or to select from a menu. When the user requests to initiate a new assessment file, a setup screen can be displayed requesting that the user provide particular setup (e.g., characteristic) inputs regarding that new file, as shown at block 112 of FIG. 2.
FIG. 3 illustrates an exemplary setup window 200 that may be utilized for this purpose. In particular, the name of the auditor (assessor) can be requested by the program and supplied by the user in text box 202 (or automatically supplied by default on a particular user's device), the company or business party being assessed can be requested by the program and supplied by the user in text box 204, the location of the company can be requested by the program and supplied by the user in text box 206, and the scope or type of assessment can be requested by the program and supplied by the user in text box 208. Such text boxes or input boxes can be created in any of a variety of manners, depending on the programming language utilized. For example, commands are provided in Visual Basic to present suitable text or input boxes.
In addition to entering this identification information, the user may also enter further setup characteristic inputs to choose the variety and depth of requirements that will be utilized in assessing the business party. For instance, as shown in the exemplary embodiment of FIG. 3, a check list box 210 can be provided to allow the user to select whether the assessment is to be a "standard," "advanced," or "enhanced" assessment, and to thereby control the depth and/or number of assessment requirements which are utilized during the assessment. Any of a variety of data input boxes can be utilized to allow the user to select one or more of these assessment types, such as radio buttons, pick list or checklist boxes, and the like. Various suitable textboxes can be created using any of a variety of programming languages, including the Visual Basic language. It is contemplated that the invention might also include computer implemented instructions that would allow a user to customize the user screen, checklists and/or data entry steps for recording performance assessments during particular audits.
Other lists can also be provided to allow the auditor to expand or limit the requirements that will be utilized during the assessment based upon the assessment characteristics. For example, as shown in FIG. 3, list box 212 can be utilized to select the objectives (or focus) of the assessment, which can then be utilized by the program to narrow the listing of assessment requirements which will be presented to the user, and provide a more focused, tailored assessment for any particular application. In this example, the list includes issues related to development capability, data integrity, control, contamination, and material control. "Material control" relates to the business party's ability to identify and control specific lots of off-quality material from the supply chain, as well as the ability to identify the processing conditions (materials, facility, equipment, operators, settings, etc) that produced specific lots of materials. The correct labeling and use of materials also falls under this objective. "Contamination" relates to things not belonging in or on a material, and or minimum required purity levels. The "control" objective relates to the ability of the business party to understand and maintain qualified processing conditions, (e.g. Process Control, Quality Control, Change Control, Incoming Material Control, etc), the party's ability to validate to ensure that intended outcomes (or processes, systems, test methods, etc) are achieved, and the party's ability to ensure that suppliers are using the approved specifications. "Data integrity" relates to the party's use of testing data to ensure only acceptable materials or products are released to the company. "Development capability" relates to the party's ability to innovate in terms of new products/materials, processes, cost savings, solutions to problems, and the like.
These objectives create additional ways to identify relevant requirements for the assessment. For example, if the assessment was to follow up on a previous quality problem due to contamination issues, the user could select the "contamination" item to narrow down the universe of relevant requirements to only those related to contamination. However, additional and/or alternative objectives can be utilized depending on the type of business parties which are typically assessed by the auditors who will use the program. The selection of certain core objectives allows the user to include in the assessment only those requirements that relate to a specific objective (e.g., Contamination), allowing that particular assessment to be focused on key issues when adequate time is not available to do a complete assessment, or when a complete assessment is not necessary.
Other lists may also be provided to further expand or limit the evaluations made by the auditor. For example, certain requirements can be excluded from the assessment if the party has already been adequately certified, approved, and/or qualified with regard to another external organization. In particular, as shown in FIG. 3, the list box 214 allows the user to select whether requirements of the ISO (the International Organization for Standardization) will be excluded during the assessment, such as the requirements needed to be fulfilled in order to be certified by the ISO. If the party to be assessed is already ISO certified, then there would be no need to assess the party with respect to requirements overlapping or similar to those of the ISO. Likewise, a selection can also be provided in box 214 to allow the user to select whether the GMP (Good Manufacturing Practices) requirements will be excluded during the assessment. Other lists and items can be provided as desired to further limit or narrow the requirements that the user will evaluate during the assessment. Once all inputs have been provided in the setup screen of FIG. 3, the user may select the create new audit button 216 to use these inputs in setting up a file for the assessment and for selecting the requirements which will be utilized during the assessment.
Returning again to FIG. 2, at block 114, performance assessment setup inputs are supplied, such as those contemplated in this illustrative example via the setup screen, and, at block 116, they are utilized to select and display the relevant assessment categories. For example, if the user selected that "standard", "advanced", and "enhanced" requirements be utilized, a more lengthy set of categories and associated requirements would be selected and displayed than had the user selected only the "standard" requirements. Likewise, particular categories and requirements would be included depending on the items selected in the "objective" input box of the setup screen. Moreover, if the user selected ISO and/or GMP requirements, then any requirements similar to those used by those organizations would not be selected and displayed to the user.
FIG. 17 illustrates an exemplary relational database structure 90 which might be utilized for automatically determining which requirements to include in a given assessment, based upon the user's setup inputs, such as type, objectives, etc. In this example, the assessment requirements are arranged in a database system having records 250 (the assessment requirements) and various fields F associated with these requirements (whether the requirement is standard, advanced, or enhanced; the objectives to which the requirement relates; whether the requirement is related to ISO or GMP, etc.) Which fields apply to a given requirement can be provided in any number of suitable manners, such as by appending a code to the record, selecting a bit, etc.
As also shown in FIG. 17, this database 90 is in the form of a tree structure. By this, it is meant that each requirement 250 is related to a category 222, or key element, such as a starting materials category or a packing operations category, depending on the business area to which the requirement relates. In other words, requirements may be grouped depending on the business area or business function to which they are related. Similarly, each requirement might also be related to one or more subcategories 230 which fall under the main category. Rather than saving the actual category or subcategory descriptions, in order to save storage space, identifiers, such as codes, numerals, data, and/or letters for instance, could be stored in the database. These identifiers can be matched with the corresponding category or subcategory description for display or reporting purposes.
The common fields F allow the computer processor to quickly determine which requirements 250 relate to the selections provided on the start up screen. For instance, in the example of FIG. 17, had the user wished only to utilize ISO requirements during the assessment by selecting the appropriate item in the setup screen, the computer processor would select only Requirements 1.1.1, 1.1.2, and 2.1.1.2 from the universe of requirements in the database 90.
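A minimal sketch of this field-matching selection follows, assuming each record in database 90 carries simple flag fields; the flag names and depth values shown are assumptions, but the example reproduces the ISO selection described above (Requirements 1.1.1, 1.1.2, and 2.1.1.2).

```python
# Hypothetical records of database 90: requirement identifiers with fields F.
database_90 = {
    "1.1.1":   {"depth": "standard", "iso": True,  "gmp": False},
    "1.1.2":   {"depth": "advanced", "iso": True,  "gmp": True},
    "2.1.1.1": {"depth": "standard", "iso": False, "gmp": True},
    "2.1.1.2": {"depth": "enhanced", "iso": True,  "gmp": False},
}

# Selecting only the ISO-related requirements, per the user's setup input.
iso_only = [req_id for req_id, fields in database_90.items() if fields["iso"]]
print(iso_only)  # ['1.1.1', '1.1.2', '2.1.1.2']
```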
Returning again to FIG. 2, at block 116 of the exemplary method, the program displays the categories which are relevant to those requirements which have been selected based upon the user's input. In other words, as described above, certain requirements are automatically selected by comparing the user's inputs provided in the setup screen with fields in the database. For these chosen requirements, the various categories to which these requirements belong are displayed. This is illustrated in the category window 220 of FIG. 4. In this view, the categories 222 related to the selected requirements are displayed in a list window 224. A scroll bar 226 can be provided to allow the user to scroll through this list. Identification information can also be displayed along with the list window 224, such as the business party name 204, the business party location 206, the audit scope/objective 208, and the audit type 209 selected by the user in the setup window. Images 228 may also be displayed next to text which is displayed.
The user may select, such as by moving a pointer or cursor with an input device, any of the categories 222 in the list 224 window. This is shown in the exemplary flow diagram of FIG. 2. As shown in block 118 of FIG. 2, this selection is received by the computer. In response to the selection, any subcategories associated with the main category selected are displayed, as shown at block 120.
In particular, as described above with respect to FIG. 17, the requirements under the main category area can be sub-divided under subcategories. For example, certain assessment requirements in one application might fall under a main category entitled House Keeping/Pest Control/Maintenance. Under this main category, the requirements may be further divided into groups. For instance, some of the requirements may relate to "general principles" of the main category, others may be specific to "housekeeping", others may be specific to "pest control", and still others may be specific to "facility maintenance". The exact divisions of categories and sub-categories will depend upon the type of requirements which will be utilized in conducting assessments for an application, and can vary widely depending on the businesses that are assessed, the main business of the assessing company, and/or the level of quality desired.
FIG. 5 illustrates a subcategory window 229, which displays this hierarchical relationship in response to a selection of an assessment category by the user. In this case, the user selected the category "Housekeeping, Pest Control, & Maintenance" from the category window 220 of FIG. 4. In response to this selection, the computer displays a subcategory list 231 which shows the sub-categories 230 which fall under the selected category 222'. Such a hierarchical arrangement of the assessment requirement data can be achieved using appropriate display commands. For example, if Visual Basic is utilized, suitable commands are available to generate such nodes in a hierarchical arrangement.
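The specific Visual Basic commands referred to above are not reproduced here; instead, the following language-neutral sketch (in Python, with illustrative labels) shows the same idea of walking nested category, subcategory, and requirement nodes to produce a hierarchical display.

```python
# Hypothetical nesting: category -> subcategories -> requirement labels.
tree = {
    "Housekeeping, Pest Control, & Maintenance": {
        "General Principles": ["Owners Identified and Records Maintained"],
        "Pest Control": ["Requirement 3.2.1", "Requirement 3.2.2"],
    },
}

def show(node, indent=0):
    """Print nodes in a hierarchical (indented) arrangement."""
    if isinstance(node, dict):
        for label, children in node.items():
            print(" " * indent + label)
            show(children, indent + 2)
    else:
        for leaf in node:
            print(" " * indent + leaf)

show(tree)
```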
As also shown in FIG. 5, a description window 240 is also provided within the sub-category window 229. This description window 240 displays information regarding the categories, sub-categories, and/or assessment requirements. In this example, a category description 242 is provided which describes the category 222' which was selected and/or the principle or standard to which the business party will be held with respect to that selected category. In addition, the subcategories 230 are listed, as well as any sub-subcategories 232 falling under each subcategory. Finally, all of the requirements 250 falling under the selected category 222' are displayed within the Window 229.
The category selected 222' can be given an overall numerical rating by the user, as well as a textual commentary observation. In particular, category rating buttons 234 are also displayed within window 229 in response to a category 222' being selected. These buttons 234 can be utilized in selecting an overall rating for that assessment category 222'. In this example, the ratings are numerical ratings of 2, 4, 6, 8, and 10, corresponding to unsatisfactory, poor, fair, good, and excellent performance in the selected category 222'. A button 234 can also be provided to indicate that the category 222' is still being rated, and another button 234 (N/A button) can be provided to indicate that the category is not applicable (e.g., it will not be rated). In addition, one or more comment entry boxes 236 can be provided to allow the user to input text describing the performance of the assessed business party in the selected category 222'. While category ratings and comments can be entered at this point using the buttons 234 and box 236, it is recommended that the user wait until the requirements 250 falling under the category 222' have been evaluated before making an overall evaluation of the category. Moreover, as described below, the program can automatically determine a suggested category rating based upon the ratings given for the requirements 250 under the category 222'. This suggested rating can assist the user in making an overall rating for the category 222'.
Using an input device, the user can also select a sub-category in the window 224 of FIG. 5 in order to view a list of any sub-subcategories which are related to the selected sub-category. In response to the selected subcategory 230', the related sub-subcategories 232 are displayed, as shown in window 233 of FIG. 6. In this example, the selected subcategory 230' is "General Principles", and the sub-subcategories 232 falling under this are "Owners Identified and Records Maintained". Also, in response to the subcategory selection, the information displayed in the window 240 is limited to the selected subcategory 230', as well as the related sub-subcategories 232 and the related requirements 250 falling under the selected subcategory 230'.
This process of selecting assessment subcategories 230 and displaying the related sub-subcategories 232 is shown in blocks 122 and 124 of FIG. 2. The user's selection from the input device is received at block 122, and the sub-subcategories are displayed at block 124, such as in the hierarchical tree format of FIG. 6, which allows the user to view all of the sub-subcategories 232, categories 222, and requirements 250 related to the selected subcategory 230'. Accordingly, the user can quickly view the relationship between the requirements 250 and the various groupings (i.e., categories 222 and subcategories 230).
As can be understood, the assessment requirements 250 may be divided into fewer, less, or no groupings as desired. The particular hierarchical arrangement chosen can depend on the number and type of requirements 250 that will be utilized with the automated assessment assistance system described herein. As noted above, the uses of this system can vary widely among companies, businesses, products, and services, and, accordingly, the assessment requirements 250 needed or desired can vary widely. In this illustrated embodiment, a first selection, such as a single click, on one of the categories 222 displays all related subcategories 230 and requirements 250 in the window 240, while a second selection, such as a double click for example, provides this same display in window 240 but also expands the tree format to display all related subcategories in window 224, as shown in FIG. 5. Likewise a first selection of a subcategory 230 will display all related sub-subcategories 232 and requirements 250 in window 240, while a second selection provides this same display in window 240 but also expands the tree format to display all related sub-subcategories in window 224, as shown in FIG. 5. Accordingly, the user can choose whether to "drill down" or expand the tree format of window 224. Other input mechanisms could be provided for this purpose. For example, view levels could be provided to allow the user to expand or limit the detail provided regarding the hierarchical arrangement of the categories 222 and related subcategories 230/232 shown in the window 224. Buttons, menus, icons, voice commands, and the like can be utilized to allow the user to select the desired detail level. In the window 240 of FIG. 5 or FIG. 6, the user may select any of the requirements
250 which are displayed, in order to rate the business party with respect to how closely the party meets the requirement. Upon selecting the desired requirement 250, the full description of that selected requirement will appear in the window 240, as shown in the requirement rating screen 249 of FIG. 7. In particular, the window 240 will expand to view the full description of the selected requirement 250', as well as the related subcategory 230' and sub-subcategories 232' of that requirement. In this way, an auditor or other user can verify that the proper or desired requirement is being viewed. In addition, upon selection of the desired requirement 250, requirement rating/criticality buttons 252 are also displayed to allow the user to rate the business party's performance with respect to that requirement. In this illustrated example, the buttons 252 are labeled "pending" for when the requirement 250' is still pending a rating by the user (this can be used as the default button), "N/A" for when the requirement is not applicable to the party, "excellent" for excellent performance with respect to the requirement, "acceptable" for acceptable performance with respect to the requirement, "minor" for minor problems with the performance expected (as described by the requirement description), "major" for major problems with the performance expected, and "critical" for critical problems with the performance expected. For instance, "excellent" could be used when the party has an outstanding system or result with respect to the requirement, "acceptable" could be used when the party meets the requirement, "minor" could be used when the party has minor issues with respect to the requirement which have low probability of affecting the quality or usability of the company's product, "major" could be utilized when the party has major issues with respect to the requirement which can reduce the overall product quality and thereby have an adverse affect on the usability or marketability of the company's product, and "critical" could be utilized when the issues with respect to the requirement have a high probability of resulting in a recall of the company's product or injury/illness to a consumer. Such "critical" issues could have a direct affect on the strength, identity, purity and effectiveness of the company's product, and could arise from direct evidence of product contamination or loss of integrity. While reviewing the business party's facilities and/or operations regarding the requirement 250', the user can select the appropriate button which the user feels describes the business party's performance in meeting the requirement. In addition, a text input box 254 can be used to allow the user to input one or more text comments relating to the business party's performance with respect to the selected requirement 250', such as any issues, problems, observations/commendations, improvements relating thereto. In one exemplary embodiment, voice recognition technology is utilized in entering such comments, for efficiency in conducting the assessment. Alternatively, when auditing in a noisy environment, notes can be transcribed using voice recognition after exiting the noisy area. For requirements rated with a "major" or "critical" rating, text should be provided in text box 254 to clarify exactly what issues or problems were observed for the requirement. 
Accordingly, during an assessment, assessors can compare the business party's operations and facilities to the defined quality requirements, and using the text input box 254, the assessor can note deviations from or satisfaction of the requirements via detailed descriptions. The text input box 254 can be used to record issues or problems observed, as well as to explain any risks due to issues/problems. The criticality buttons 252 can be used to identify the concern or significance of the issues observed. These requirement selection and rating steps are shown in the flow diagram of FIG. 2.
In particular, as shown at block 126, the requirement selection, such as a click or touch of the requirement, is received by the computer. In response to the requirement selection, the full requirement description is displayed, as well as the rating buttons and input boxes related to that requirement, as shown at block 128. The program then receives the rating which is made for that requirement, along with any comments that the user might have with respect to that requirement, as shown at block 130. These ratings and comments can be stored in a database, along with an identifier of the requirement being rated.
Turning now to the exemplary rating screen 249 of FIG. 8, the user has rated the selected requirement 250'. In particular, the selected rating button 252' is the "major" button, indicating "major" issues with respect to the requirement 250'. After the user rates the requirement 250' using a rating button 252, the computer can display a rating icon 256 corresponding to the selected rating 252'. In this example, a dot of a particular color is utilized to indicate the rating with respect to that requirement 252'. However, other images, icons, text, numerals, letters, indicia, and the like could be utilized and displayed next to a requirement 250' to indicate the current rating given by the auditor to that requirement.
The text 257 is input by the auditor using an input device, which may comprise a voice recognition device for maximum convenience, to indicate any comments that the auditor may have regarding the business party's performance with respect to the requirement 250'. For example, it may be desirable to provide additional detail regarding how the party performed with respect to the requirement 250', how the party could improve with respect to the requirement, or to otherwise take notes or memorialize observations with respect to the requirement.
Other requirements 250 for other categories 222 and sub-categories 230/232 can be viewed as described above with respect to FIGS. 4 - 6. The requirements can then be selected and rated as described above with respect to FIGS. 7 and 8. The user need only select the desired requirement 250, and use the buttons 252 and text box 254 to provide the rating. In addition to or as alternatives to the buttons 252 and box 254, other buttons, boxes, windows, icons, lists, selectable items, and the like could be utilized to assist the auditor in rating a requirement 250. Such an automated process does not require the auditor to remember the categories 222, sub-categories 230-232, and requirements 250, and can automatically select the pertinent requirements based upon the type of assessment desired, and other input provided in the setup screen. By listing all pertinent requirements 250, it can be ensured that all requirements are covered, and the risk that a requirement goes unrated is minimized. By displaying the standardized description of the requirement 250, the auditor need not attempt to remember such descriptions, thereby minimizing the risk that incorrect standards are utilized. Moreover, the use of computer technology in making ratings and observations can speed the process of making assessment reports, and can allow for analysis and processing of the ratings received, even as the assessment is taking place, as described in further detail below.
In particular, after rating the displayed requirements, the auditor can select another category 230 or subcategory 232 in the window 224 of FIG. 8. For example, the auditor could select the current category 230'. Such a selection will return the user to the viewing screen 233, as shown in FIG. 9. It should be noted that this is the same view as in FIG. 6, except that the ratings icons 256 show the ratings which the user provided for the requirements 250. Another difference is that the progress percentages 258, 260, 262, and 264 (which are displayed next to the assessment type 208, the categories 222, the subcategories 230, and the sub-subcategories 232 respectively) have been updated to reflect the requirements 250 which have been rated. In other words, the computer calculates the percentage of the total requirements which have been rated and displays this as a total progress percentage 258. Likewise, the computer calculates the percentage of the total requirements of a category 222 which have been rated and displays this as a category progress percentage 260. Similarly, the percentage of the subcategory requirements which have been rated is displayed as a subcategory progress percentage 262, and the percentage of sub-subcategory requirements which have been rated is displayed as a sub-subcategory progress percentage 264. Accordingly, in addition to providing the hierarchical arrangement of the various requirements, the window 224 also provides the user with continual updates of his or her progress. More specifically, the user knows the percentage completed (i.e., rated) of the total requirements, category requirements, subcategory requirements, and sub-subcategory requirements. Thus, the user is given a continual progress report as the assessment takes place. In addition, in the exemplary embodiment, a suggested rating is calculated and displayed for each category 222. This suggested rating is shown with reference numeral 266 in FIG. 9, and is based upon the ratings given by the user for the various requirements within that category. A formula, algorithm, or calculation, such as an averaging calculation for example, can be utilized to provide these suggested category ratings 266. In the exemplary embodiment, the following algorithm/process is utilized:
- When a category has 1 or more "Excellent" ratings and no "Minor", "Major," or "Critical" ratings for the requirements within it, it receives a 10.
- When a category has 4 or fewer "Minor" ratings and no "Major" or "Critical" ratings for the requirements within it, it receives an 8.
- When a category has 5 or more "Minor" ratings but no "Major" or "Critical" ratings for its requirements, it receives a 6.
- When a category has fewer than 3 "Major" ratings and no "Critical" ratings for the requirements within it, it receives a 4.
- When a category has fewer than 5 "Major" ratings and 1 or 2 "Critical" ratings for its requirements, it receives a 2.
- When a category has 3 or more "Critical" ratings or 5 or more "Major" ratings for the requirements within it, it receives a 0.
As can be understood, such an algorithm can be programmed to allow for the automatic calculation of a suggested category rating. For example, the algorithm could be programmed as one or more condition statements, such as "IF-THEN" statements, to automatically determine a suggested category rating based upon the ratings given to the requirements within the category. In the exemplary embodiment, the user may override the suggested rating, although override capability need not be permitted if it is desirable to force the user to utilize the automatically determined category rating.
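While the disclosure does not specify an implementation language, the following is a minimal sketch of how such condition statements might be coded; the function name, the treatment of ratings as string labels, and the handling of combinations not covered by the exemplary rules are illustrative assumptions rather than part of the described embodiment.

```python
def suggested_category_rating(requirement_ratings):
    """Suggested 0-10 category rating per the exemplary rules above.

    requirement_ratings: list of rating labels given to the requirements
    within the category, e.g. ["Acceptable", "Minor", "Excellent", ...].
    """
    excellent = requirement_ratings.count("Excellent")
    minor = requirement_ratings.count("Minor")
    major = requirement_ratings.count("Major")
    critical = requirement_ratings.count("Critical")

    if excellent >= 1 and minor == 0 and major == 0 and critical == 0:
        return 10
    elif minor <= 4 and major == 0 and critical == 0:
        return 8
    elif minor >= 5 and major == 0 and critical == 0:
        return 6
    elif major < 3 and critical == 0:
        return 4
    elif major < 5 and critical in (1, 2):
        return 2
    elif critical >= 3 or major >= 5:
        return 0
    # Combinations not covered by the exemplary rules (e.g., 3 or 4 "Major"
    # ratings with no "Critical" ratings) are left to the assessor.
    return None
```

The override behavior described above could then be implemented simply by presenting this value as a default which the user may accept or replace.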
Turning again to FIG. 2, blocks 132 and 134 illustrate these progress update and rating suggestion steps of this exemplary method. In particular, at block 132, the ratings for rated requirements are displayed, such as, for example, next to the requirement description or other identification code for the requirement. In addition, progress percentages are displayed at block 132, to inform the user as to how much of the assessment has been completed and how much still remains to be completed. At block 134, the suggested category rating is calculated from all of the rated requirements within the category, and this suggested rating is displayed for use by the auditor. In particular, the suggested ratings 266 of FIG. 9 can be utilized by the auditor in providing overall ratings for each category 222. As noted above, the auditor can select a category 222, such as by clicking on or otherwise pointing to the desired category. In response to such a selection, the screen 229 is again displayed, as shown in FIG. 10. However, the suggested rating 260' for the selected category 222' has changed from the default (in this case 8) to a value based upon the recently entered ratings for the requirements 250 falling within that category 222'. The user may utilize this suggested rating, or, alternatively, the user may provide another rating using the category rating buttons 234. In the exemplary embodiment, these buttons 234 allow the assessor to assign a quantitative rating on a scale of 0-10 which corresponds with the business party's degree of implementation and effectiveness relative to the principled intent of the category 222. For example, a rating of 10 could be used when excellent implementation and excellent effectiveness of the category standard/description 242 have been observed; a rating of 8 could be used when full implementation and full effectiveness of the category standard 242 have been observed; a rating of 6 could be assigned when satisfactory implementation and partial effectiveness of the category standard 242 have been observed; a rating of 4 could be assigned when partial implementation and partial effectiveness of the category standard 242 have been observed; a rating of 2 could be used when some attempt at implementation but no effectiveness of the category standard 242 have been observed; and a rating of 0 could be appropriate when no implementation and no effectiveness of the category standard 242 have been observed. In order to help ensure more accurate category ratings, the category rating should be assigned using the buttons 234 after the requirements within the category 222 have been adequately assessed. In addition, the user may provide comments in text box 236 to describe or summarize the overall performance of the business party with respect to the selected category 222'. These category ratings and comments can then be saved in a database, with an indication of the category to which they apply.
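For completeness, the progress percentages 258-264 displayed at block 132 reduce to a simple rated-over-total calculation applied at each level of the hierarchy. The sketch below assumes, purely for illustration, that requirements are held as records with a rating field that is empty until rated.

```python
def progress_percentage(requirements):
    """Percentage of the given requirements that have been rated."""
    if not requirements:
        return 0.0
    rated = sum(1 for req in requirements if req.get("rating") is not None)
    return 100.0 * rated / len(requirements)

# The same helper can be applied to the full requirement list (total progress
# 258), to one category's requirements (260), to one subcategory's
# requirements (262), or to one sub-subcategory's requirements (264).
```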
These steps of providing category ratings are shown in blocks 136 to 142 of FIG. 2. In particular, at block 136, the user decides whether to enter more ratings for requirements within a category. If more ratings are to be entered for the requirements within the category, the process continues to block 126, and the user can select and rate the desired requirements, as described above. However, if no more ratings are to be entered, the user can select the category in order to provide an overall category rating. In particular, as shown at blocks 138 and 140, a category selection is received from the user, and, in response to this selection, category rating buttons and a category comment box are displayed. Accordingly, the user can input a rating for the category as well as comments regarding the performance observed for the category. Once received, these inputs are saved, as shown at block 142.
Other features can also be provided as desired with such a system and method. For example, as shown in FIG. 11, a status screen 280 can be displayed when the user makes a particular selection, such as from a menu, button, or icon. In particular, the status screen 280 can illustrate how far along the assessment is, as well as give a summary of the ratings given so far. For example, the screen 280 can list the number of requirements rated so far, out of the total requirements to be rated, as shown by reference numeral 282. Moreover, the categories rated so far out of the total categories to be rated can also be displayed, as shown by reference numeral 284. Other progress numbers can also be displayed, such as the percentage 286 of requirements rated, as well as the number 288 of requirements rated. A running total of the ratings can also be displayed, such as the number of not applicable ratings, the number of excellent ratings, the number of acceptable ratings, the number of minor ratings, the number of major ratings, and the number of critical ratings, as shown by reference numerals 290-295. In addition, a capability rating 298 can also be provided based upon the requirements rated so far, in order to provide a suggested overall score for the entire assessment. Such an overall score 298 can be automatically calculated by the computer using a formula, algorithm, calculation, or the like. For instance, the following calculation can be utilized:

Capability = [(Σ Category Ratings >= 8) + 0.5 * (Σ Category Ratings = 6) - (Σ Category Ratings <= 4)] / (Total Requirements Rated) * 100%

The score can then be displayed along with the type/depth of assessment conducted (e.g., standard vs. advanced).
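As one hypothetical way of encoding the capability calculation shown above (the function name and data layout are assumptions used only for illustration), the score could be computed as follows:

```python
def capability_score(category_ratings, total_requirements_rated):
    """Suggested overall capability score from the 0-10 category ratings."""
    if total_requirements_rated == 0:
        return 0.0
    strong = sum(1 for r in category_ratings if r >= 8)   # ratings >= 8
    partial = sum(1 for r in category_ratings if r == 6)  # ratings = 6
    weak = sum(1 for r in category_ratings if r <= 4)     # ratings <= 4
    return (strong + 0.5 * partial - weak) / total_requirements_rated * 100.0
```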
In addition to assisting auditors in making assessments as described above, the exemplary embodiment of the present invention can also automatically assist in the preparation of assessment reports. In particular, as shown in FIG. 12, the assessing computer 322 can transfer and share the collected assessment data with one or more other computers 324 and 326, and/or with one or more servers 320. The connection 321 between these devices can be a network connection, such as via Internet, Intranet, and/or Ethernet, or can be a wired or wireless peer-to-peer connection, such as via modem or other communication device. In the exemplary embodiment of FIG. 12, the assessing computer 322 includes a processor 334 which executes assessment software in memory 336. The assessment software can operate as described above with respect to FIGS. 1-11. The assessment input data saved using such software can be automatically placed into a report format and uploaded to server 320, which utilizes a processor 330 to store the data in central database 332. The connection 321 can allow access to this data by other computing devices 324, 326, such as through an Internet or Intranet connection. Accordingly, assessment data recorded by one computer 322 can be quickly and easily accessed, analyzed, summarized, and viewed by other computers.
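The disclosure leaves the transport between the assessing computer 322 and the server 320 open (network, modem, wireless, etc.). Purely as an illustration, an upload over HTTP might look like the sketch below; the URL, the JSON payload, and the function name are assumptions and not part of the described embodiment.

```python
import json
import urllib.request

def upload_assessment(assessment_data, server_url):
    """Send collected assessment inputs to a central server for storage."""
    payload = json.dumps(assessment_data).encode("utf-8")
    request = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g., 200 when the server stores the data
```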
FIG. 13 provides an example of browser software which may be utilized by the computers 322, 324, and 326 to access the assessment data stored in the central database 332. In this example, the browser software is the NAVIGATOR browser made by Netscape, although other browser or viewing software could be utilized, such as the EXPLORER browser made by Microsoft for instance. The browser window 340 provided by such software can list and allow access to the assessment input data and/or reports prepared by various auditors with respect to various business parties. In the example of FIG. 13, each assessment report is listed by business party (e.g., supplier) name 344, city 346, country 348, date of the assessment 349, and assessment objective or scope 350. The user can select one or more of these fields to then access the actual assessment report displaying and/or summarizing the assessment data collected. Alternatively, a separate or additional icon or link 352 can be provided to allow selectable access to the report.
Accordingly, by providing the central database 332 of report data, quick and efficient access to assessment reports is provided, even from remote locations. Moreover, the central database or repository 332 of assessment reports and/or data can be utilized to easily determine when to assess suppliers, which supplier sites to assess, to what depth (e.g., standard vs. advanced) to assess suppliers, and how often to re-assess suppliers. This can be determined, automatically or manually, based upon the length of time since the last assessment, the ratings given in previous audits, the number of critical requirements found, the number of audits conducted on the party, the importance of the party to the company's product/services, the experience of the party with the company, recent events which might indicate a need for an interim assessment, etc. In addition, because all assessments are held in a central database, or repository 332, which is easily accessible, the risk is minimized of redundant assessments of the same business party by various divisions within the company. Accordingly, prior to conducting an assessment, procedures might require the assessor to access the central database 332 to ensure that a relevant assessment has not already been conducted recently. The system might also be set up to automatically contact the central database, such as via a wireless remote connection, to undertake this check whenever a new audit/assessment is initiated. It is also contemplated that prior to commencing recordation of a new assessment, the user may want to access information as to when the last audit of a particular type was completed for a particular business, and the results of that inspection. Such access could be requested and reviewed via the methods and systems of the present invention. Appropriate security measures can be taken to restrict read and/or write access to the assessment data in the central database 332, and/or to otherwise control the information which can be accessed by a remote user. The assessment collection software described above can also allow for exporting of the data collected to a compilation database, separate from or integral with database 332, which allows for all data from all assessments to be searched, summarized, and otherwise analyzed simultaneously. This can be achieved, for example, by adding assessment data to the compilation database upon completion of an assessment, or by concatenating data from all assessment reports generated by the auditors. Such a compilation database can be used for trend analysis to identify requirements which a party or multiple parties are frequently having problems fulfilling, and/or to quickly locate the most problematic/critical requirements. Summaries of requirement and category ratings and evaluations can be generated from the compilation file to assist in identifying common causes of problematic areas. Other data or statistical analysis can be conducted as desired as well.

FIGS. 14-16 illustrate exemplary reports 358 that can be automatically created by the assessment software described herein, or by separate software, from the assessment input provided by the user, such as described above. In the example of FIG. 14, the assessed party's name 360 and location 362, the auditor's name 364, the assessment scope 366, and an internal assessment name or code 368 are displayed. Assessment data 369 is also displayed, in table format in this exemplary report.
In particular, the assessment date 370 is displayed as well as the days elapsed 372 since the assessment. In addition, the report 358 displays the total number 374 of records in the database, the number 376 of requirements assessed, the percentage 378 of the total applicable requirements that have been assessed, as well as the total requirements 380 which remain unassessed (e.g., which are "pending"). Data regarding the ratings given to the assessed requirements can also be displayed, such as the total number 382 of requirements rated "inapplicable", the total number 384 of requirements rated "excellent", the total number 386 of requirements rated "acceptable", the total number 388 of requirements rated "minor", the total number 390 of requirements rated "major", and the total number 392 of requirements rated "critical". These requirement ratings can be recorded using the requirement rating buttons 252 described above, or a similar input mechanism and user interface.
Such data also can be displayed with graphs or charts, if desired. For instance, pie chart 394 is displayed in report 358, with a piece 391 indicating "minor" ratings, a piece 393 indicating "major" ratings, a piece 395 indicating "critical" ratings, a piece 396 indicating "pending" ratings, a piece 397 indicating "not applicable" ratings, a piece 398 indicating "excellent" ratings, and a piece 399 indicating "acceptable" ratings. Other methods or views for display of data could be utilized in addition to or as alternatives to those of exemplary report 358. For instance, in FIG. 15, the report 358 is in table format and shows the ratings which were given for each of the categories 222. In particular, displayed near the top of the report 358 is identification data, including the name 360 and location 362 of the assessed party, the name 364 of the auditor who assessed the party, the date 370 of the assessment, and an identification code 371 for the assessed party. Moreover, within a table 359 are displayed the categories 222 (i.e., category descriptions) which were assessed for the assessment, as well as a corresponding category rating 235 for each of the listed categories. These ratings 235 can be entered and recorded using the category rating buttons 234 described above, or a similar user interface input device. In addition, one or more comments 237 are displayed for each category 222. These comments 237 can be entered using the category comment entry box 236 described above, or a similar user input mechanism. In addition, an overall rating is provided for the assessment. In this example, the overall rating is the capability rating 298 which is automatically calculated by the computer, such as via the equation 299.
FIGS. 16a and 16b illustrate another assessment report format that may be utilized. Identification data 360, 362, 364, and 370 are displayed at the top of the report 358. The body of the report includes a table 359 which lists each of the relevant categories 222, as well as the rating 235 applied to each category. In this example, columns are provided for each possible rating, and X's or other indicators are placed in the appropriate column to indicate the rating given for the category 222. In the table 359, the comment 237 saved for each category is also displayed. The ratings 235 can be assigned using the rating buttons 234 described above, and the comments 237 can be entered using the input boxes 236 described above.
The reports 358 described above can be automatically generated by a report generation program, which may be part of the assessment entry software described above with respect to FIGS. 1-11 (such as program 336 of FIG. 12), or which may be provided as a separate program (such as program 332 illustrated in FIG. 12). The reports 358 may be generated automatically by accessing the assessment inputs stored during the conducting of the assessment. Rather than saving the category description 222 or requirement descriptions 250 with this data, identifier codes can be saved. Then, when the report is generated, the codes can be matched up with the corresponding descriptions for display in the report 358, if desired. For each identifier, the rating given by the auditor is also stored, as well as the auditor's comment(s). Accordingly, the reports 358 can be easily and automatically generated once the assessment ratings have been collected.
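A minimal sketch of such identifier-based report generation is given below; the dictionary layout and field names are assumptions used only to illustrate matching stored identifier codes to their standardized descriptions.

```python
def build_report_rows(saved_inputs, requirement_descriptions):
    """Assemble report rows by matching stored identifier codes to descriptions.

    saved_inputs: maps requirement identifier codes to the rating and
    comment(s) recorded during the assessment.
    requirement_descriptions: maps the same codes to the standardized
    requirement text, so descriptions need not be saved with each rating.
    """
    rows = []
    for code, entry in saved_inputs.items():
        rows.append({
            "requirement": requirement_descriptions.get(code, code),
            "rating": entry["rating"],
            "comment": entry.get("comment", ""),
        })
    return rows
```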
FIG. 19 illustrates an additional exemplary report that may be automatically generated using an assessment entry program made in accordance with principles of the present invention. In this example, the report includes an assessment summary portion 401 which provides a summary of the assessment conducted. In particular, various identification information can be presented, such as the business party's identification information 402, the business party's contact information 404, and the assessor identification information 426. Some or all of this information can be collected using the assessment collection software, such as by using the setup information screen of FIG. 3, or the like. Additional information can also be collected regarding the assessed party, such as, for example, a description 406 of the assessment purpose, a description 408 of the facilities, operation, or warehouse assessed, a description 410 of the business party assessed, and a description 412 of other external audits, certification, or associations of the business party. This information can likewise be entered using a screen similar to FIG. 3, and automatically saved with the assessment inputs. Alternatively, the assessor can be requested to provide this information upon generation of the report.
Summary information regarding the assessment can also be provided in the summary section 401. In particular, a summary 414 of the assessment findings can be provided, as well as a summary 424 of the actions which will be taken by the business party in response to the assessment. Such summary data can be collected by the assessment entry program upon the completion of the assessment, or can be requested upon generation of the report. The assessment data taken during the assessment can also be summarized and displayed. In particular, the "Capability" scores 416 for the current and previous assessment can be displayed, as can the number 418 of requirements rated "critical", the number 420 of requirements rated "major", and the number 422 of requirements rated "minor". Such summary data can be automatically calculated from the assessment data collected as described above.
In addition, the exemplary report includes a category rating section 403, as shown in the example of FIG. 19b. In this example, the identification information 402 is repeated at the top of the section 403, and the capability score 417 for the assessment is displayed, along with the calculation 428 used to generate the score. In addition, in table 429, the pertinent categories 222 which were assessed are listed, along with the ratings 235 given during the assessment for those categories, as well as any summary comments 237 provided for each category. In addition, a rating key 430 can be displayed to indicate the meaning of the various ratings on the scale. Moreover, the exemplary report includes a requirement rating section 405 which lists or summarizes the requirements which were not satisfied and/or for which issues or problems were observed. Again, in this example, the identification information 402 is repeated at the top of the section 405. In addition, the categories 222 which had problematic requirements are displayed. Under each category 222 is displayed the standard principle 242 of that category, as well as a system description 432, summary comments 434, and general observations 441 relating to the corresponding category. This information can be collected using text entry boxes, such as the box 236 of FIG. 10 or the like, during or after the assessment. In addition, the specific observations 257 collected during the assessment (such as by using the screen of FIG. 8 with the box 254) regarding the problematic requirements are displayed. In particular, "major" requirements are grouped under heading 436, and the requirement descriptions 250 are listed for the requirements rated as "major." Similarly, the "minor" requirement descriptions 250 are listed under the "minor" heading 438, and the "acceptable" requirement descriptions 250 are listed under the "acceptable" heading 440. Under each heading 436, 438, and 440 are displayed the comments 257 made during the assessment with respect to the requirements 250 falling under that heading. These requirement descriptions 250 and comments 257 can be automatically selected and displayed in the section 405 directly from the data collected by the program during the assessment, and need not be re-typed or re-entered by the user, thereby providing for efficient report generation. The requirements 250 and related observations 257 can also be automatically grouped under the correct heading 436, 438, and 440, further assisting the user in the report generation.
Other features and functions can also be provided with the report collection software described above. In particular, as illustrated in the exemplary assessment setup edit screen 450 of FIG. 20, the user can edit the setup information regarding the assessment during the assessment, to further include or exclude requirements. Such a screen can be automatically displayed upon the selection of an icon or menu, such as the "tools" menu 451 of FIG. 9. The screen 450 allows the user to add or modify the various identification information 202, 204, 206, and 208. In addition, the setup information utilized to automatically select the pertinent requirements can be modified as well. For example, the user can change the assessment type/focus using list 210, the assessment objectives using the list 212, and the external standards or certifications using the list 214. Once modified, the information can be saved and the pertinent requirements updated by selecting the "save" button 217.
Moreover, FIG. 21 illustrates a ratings guide 460 that can be displayed during the assessment, such as by using an icon or menu, such as a "help" menu. The ratings guide 460 can assist the assessor in assigning one of the ratings 235 for the category 222 which is currently selected. In particular, each subcategory 230 related to the selected category 222 can be displayed. Under each subcategory is displayed an explanation 462 of when the corresponding rating 235 in the table is applicable to that subcategory (i.e., what performance merits that rating.) Using such a guide 460, the assessor can better determine what overall numerical rating or grade 235 to assign to a particular category.
FIG. 18 is a process diagram illustrating the consolidation of requirements into a central database, the selection of relevant requirements from that database, the rating of the relevant requirements, and the generation of reports therefrom. In particular, a number of different requirements can be utilized for various assessment situations 270. For example, audits focused on contamination issues would use a particular set 270 of requirements. These requirements can be combined into a single requirement database 272. Each requirement 273 can be saved in the database 272 along with an identifier (e.g., code, number, bit, etc.) 274 of the categories and sub-categories to which it is related. Also, each requirement 273 can be saved with an identifier 276 of the situations in which it is to be used. For example, a certain requirement 273 may be useful when contamination is a focus or objective of the assessment. Moreover, that requirement may be detailed and therefore applicable for advanced type audits, as opposed to standard, or less detailed type audits. The central storage of the requirements in database 272 also allows for ease of modifications of requirements and/or categories as desired and needed.
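One plausible way to organize such a requirement record, with an identifier 274 of its related categories and an identifier 276 of the situations in which it applies, is sketched below; the field names and values are hypothetical and chosen only for illustration.

```python
# Hypothetical layout for one requirement 273 in the database 272.
requirement_record = {
    "id": "REQ-001",                                  # identifier code for the requirement
    "description": "Standardized requirement text displayed to the auditor",
    "categories": ["category-A", "subcategory-A1"],   # identifier 274: related categories
    "situations": {                                   # identifier 276: when it applies
        "focus": ["contamination"],                   # assessment focus/objectives
        "depth": ["advanced"],                        # e.g., advanced vs. standard audits
    },
}
```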
Upon initiating an assessment, and/or during the assessment, the user provides setup inputs regarding the type, objectives, scope, focus, and/or depth of the assessment to be conducted. Based upon these inputs, a subset 278 of the relevant requirements is automatically selected and/or saved, and then displayed to the user by selecting these requirements having identifiers 276 corresponding to the setup input. The user then enters ratings and/or observations with respect to the various requirements. In addition, ratings and observations can be made with respect to the various categories of the various requirements in the subset 278, as well as an overall rating for the assessment. The ratings and observations can be stored, along with the requirements to which they relate. Then, from the stored data, a report 281 can be automatically generated showing the categories and requirements and the ratings/observations regarding each, such as by matching identifiers for requirements and categories with descriptions relating thereto. The data can be automatically summarized, sorted, analyzed, and formatted as desired in the report 281. Each report 281 from each assessment can then be stored in a central report repository 285, such as in electronic format, which can be easily accessed by users, such as by using electronic devices in various remote locations. Moreover, the data can be exported to a central data repository 283 where it is combined with data from other audits. This data repository 283 can be accessed by users for electronically sorting, searching, summarizing and analyzing the data of all assessments, as well as identifying critical issues and trends. In addition, in conducting a subsequent assessment of a party, data from the previous report 281 of that party can be automatically inserted into the data entry fields of that subsequent assessment entry file 207 as default values. The user can either select that previous assessment data from the report repository 285 and save it as a new assessment file, or the user can select a party to audit and, in response to that selection, the previous data for that party is automatically selected from the repository 285 or 283 and inserted into the data entry fields. (For example, with respect to FIG. 8, the text observations 257 can be automatically inserted and the rating buttons 252 can be automatically selected for the various requirements.) The user can then know the previous comments and ratings for the party with respect to the various requirements, but is free to change these as appropriate during the subsequent assessment. Having the observations and ratings automatically available in the subsequent assessment entry screens 287 allows the user to easily compare subsequent performance to these previous observations and ratings, so as to identify whether improvement has been made. Pick lists and criticality lists can also be provided and displayed during the execution of the subsequent assessment data entry program to identify areas which were considered problematic during the previous assessment, to ensure that the assessor addresses these areas during the subsequent assessment. For example, with respect to FIG. 18, data from the previous assessment is provided in the subsequent assessment data entry screens 287. In particular, default observations 289 from the previous assessment are provided in the data entry screen 287, and a list 291 of issues from the previous assessment is displayed.
(For instance, a list of requirements rated "Critical" during the previous assessment could be displayed.)
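Building on the hypothetical record layout sketched earlier, the selection of the subset 278 and the pre-filling of defaults from a previous report might be coded roughly as follows; the matching rule (any focus overlap plus a depth match) and the field names are assumptions, not part of the disclosed embodiment.

```python
def select_pertinent_requirements(all_requirements, setup):
    """Select the subset 278 whose situation identifiers match the setup input."""
    subset = []
    for req in all_requirements:
        focus_match = set(req["situations"]["focus"]) & set(setup["focus"])
        depth_match = setup["depth"] in req["situations"]["depth"]
        if focus_match and depth_match:
            subset.append(dict(req))  # copy so default values can be attached
    return subset

def prefill_from_previous(subset, previous_report):
    """Insert ratings/observations from the previous assessment as defaults."""
    previous = {e["requirement_id"]: e for e in previous_report.get("entries", [])}
    for req in subset:
        prior = previous.get(req["id"])
        if prior:
            req["default_rating"] = prior.get("rating")
            req["default_observation"] = prior.get("comment")
    return subset
```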
Accordingly, the embodiments disclosed above can provide efficiency and objectivity to the auditing process. Relevant requirements can be selected and displayed quickly and efficiently, so that the auditor utilizes the correct requirement descriptions. In addition, ratings and comments with respect to the various requirements can be entered easily by the user (e.g., using a portable computer and voice recognition) to make the assessment process more efficient. Moreover, suggested category ratings and overall assessment ratings can be automatically generated as well, to assist the user in providing ratings for the various broad categories and for the assessment as a whole, and reports and statistical analysis can be automatically generated at the conclusion of the assessment. A central repository of assessment reports provides wide access to assessments, and the data from all assessments can be combined in a common database to allow for automatic creation of summaries and reports and to identify trends. The use of a range of possible ratings or criticality factors can allow the user to assess the requirement according to predicted impact on the company's business, thereby reducing the amount of subjectivity in the assessment.
The foregoing descriptions of the exemplary embodiments of the invention have been presented for purposes of illustration and description only. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and modifications and variations are possible and contemplated in light of the above teachings. While a number of exemplary and alternate embodiments, methods, systems, configurations, components, and potential applications have been described, it should be understood that many variations and alternatives could be utilized without departing from the scope of the invention. For instance, the requirements, categorization of requirements, assessment objectives, and assessment types described above can vary widely depending on the types of business parties which will be audited, and can be easily tailored for the audits contemplated. Moreover, the rating scales utilized, the meaning of the ratings, and display and report format can be easily modified as desired. Other options are also possible. For instance, if the user could not carry a computing device during an assessment, the software could still be utilized to select only the appropriate requirements for the selected assessment type, objectives, certifications, etc. Then, the selected requirements could be printed and the hard copy could be utilized during the assessment. (Accordingly, displaying can include displaying on a screen and/or printing on paper.) Moreover, any of a variety of display formats, input interfaces, programming languages, database configurations, hardware and software, etc. can be utilized to implement the various aspects discussed above. For example, a portable digital device could contact a central computer which includes the assessment input and display screens. Based upon the user setup information, the relevant assessment input requirements can be downloaded from the central computer to the digital device, in parts or as a complete set. Inputs can be provided on the digital device to rate the assessment requirements and this information can be uploaded to the central computer and/or saved on the portable device. Thus, it should be understood that the embodiments and examples have been chosen and described in order to best illustrate the principles of the invention and its practical applications to thereby enable one of ordinary skill in the art to best utilize the invention in various embodiments and with various modifications as are suited for particular uses contemplated. Accordingly, it is intended that the scope of the invention be defined by the claims appended hereto.

Claims

WHAT IS CLAIMED IS:
1. A method in a computer system for obtaining the evaluation of the performance of a business party, the method comprising the steps of: displaying a plurality of assessment categories; receiving a category input from a user indicating the selection of one of said assessment categories; in response to the category input received, displaying a plurality of assessment requirements, wherein the assessment requirements are related to the assessment category selected; receiving an evaluation input regarding a business party's performance with respect to at least one of said assessment requirements; and saving the evaluation input.
2. The method as recited in claim 1, wherein the evaluation input comprises textual comments provided through a voice recognition program.
3. The method as recited in claim 1, further comprising displaying a plurality of requirement rating choices, wherein the evaluation input is provided by selection of one of a plurality of requirement rating choices.
4. The method as recited in claim 1, further comprising: saving the evaluation input and an identifier of the requirement to which the evaluation input pertains.
5. The method as recited in claim 1, further comprising: receiving a category rating input indicating a rating of a business party's performance with respect to at least one of the assessment categories.
6. The method as recited in claim 5, wherein the category rating input is provided by selection of a category rating button.
7. The method as recited in claim 5, further comprising: receiving a comment input regarding an observation of the business party's performance with respect to at least one of said assessment categories.
8. The method as recited in claim 1, further comprising: receiving additional evaluation inputs indicating observations of a business party's performance with respect to assessment requirements; calculating a suggested category rating based upon the evaluation inputs; and displaying the suggested category rating.
9. The method as recited in claim 1, further comprising: in response to the evaluation input, displaying a rating icon adjacent the assessment requirement to which the evaluation input pertains.
10. The method as recited in claim 1, further comprising: in response to the evaluation input, calculating a percentage of requirements assessed; and displaying the percentage.
11. The method as recited in claim 1, further comprising: in response to the category input, displaying a plurality of assessment sub-categories; receiving at least one sub-category input from a user indicating the selection of one of said sub-categories, wherein the assessment requirements are displayed in response to the at least one sub-category input.
12. A method in a computer system for obtaining the evaluation of the performance of a business party, the method comprising the steps of: displaying an assessment setup screen; receiving a setup input from the user characterizing the assessment which will be conducted; in response to said setup input, selecting a sub-set of requirements to be used in the assessment from a set of requirements; displaying requirements from the sub-set; receiving evaluation inputs from a user indicating a business party's performance with respect to the requirements; and saving the evaluation inputs.
13. The method as recited in claim 12, wherein the setup input identifies the depth of the assessment to be conducted.
14. The method as recited in claim 12, wherein the setup input identifies the objectives of the assessment to be conducted.
15. The method as recited in claim 12, wherein the setup input identifies prior certifications of the business party to be assessed.
16. The method as recited in claim 12, further comprising: displaying assessment categories related to the requirements in the sub-set.
17. The method as recited in claim 16, further comprising: receiving a category input from a user indicating the selection of one of said assessment categories; and displaying assessment requirements related to the assessment category selected.
18. A method in a computer system for obtaining the evaluation of the performance of a business party, the method comprising the steps of: displaying a requirement corresponding to a business performance expectation; displaying possible ratings to be selected for the requirement; receiving a rating input from a user indicating a selection of one of said possible ratings; and saving the selected rating along with an identifier of the requirement to which it applies.
19. The method as recited in claim 18, further comprising receiving a comment input regarding observations of the business party's performance with respect to the requirement.
20. The method as recited in claim 19, wherein the comment input is received through a speech recognition program.
21. The method as recited in claim 19, further comprising: displaying a text input box for use in receiving the comment input.
22. The method as recited in claim 18, further comprising: in response to the rating input, displaying a rating icon adjacent the requirement.
23. The method as recited in claim 18, further comprising: in response to the rating input, calculating and displaying a percentage of requirements rated.
24. A method in a computer system for generating an assessment report regarding the performance of a business party, the method comprising the steps of: receiving an evaluation input regarding an observed performance of a business party with respect to an assessment requirement; saving a requirement identifier and the evaluation input in a memory device; automatically generating an assessment report by the steps of: accessing the saved requirement identifier and evaluation input; producing on a report a description of the requirement corresponding to the requirement identifier; and producing on the report the evaluation input.
25. The method as recited in claim 24, wherein the evaluation input comprises a criticality rating indicative of the criticality of issues observed with respect to the requirement.
26. The method as recited in claim 24, wherein the evaluation input comprises a textual input corresponding to comments regarding the observed performance with respect to the requirement.
27. The method as recited in claim 24, further comprising: receiving a category evaluation input regarding an observed performance of a business party with respect to an assessment category; saving a category identifier and the category input in a memory device; wherein the generating step further comprises: accessing the saved category identifier and category input; producing on the report a description of the category corresponding to the category identifier; and producing on the report the category evaluation input.
28. The method as recited in claim 27, wherein the category evaluation input comprises a category rating.
29. A system for use in obtaining the evaluation of the business performance of a business party, comprising: a portable computer device; an input device configured to provide inputs to the portable computer device; and a program adapted to display assessment requirements corresponding to a business performance evaluation and to allow input through use of the input mechanism of evaluation inputs regarding a business party's performance with respect to the requirements.
30. The system as recited in claim 29, wherein the input mechanism comprises a microphone and a voice recognition program configured to be executed by said computer device.
31. The system as recited in claim 29, wherein the program is further adapted to automatically generate a report based upon the evaluation inputs.
32. The system as recited in claim 29, wherein the program is further adapted to automatically select a sub-set of pertinent assessment requirements from a set of requirements based upon inputs to the computer device.
33. The system as recited in claim 32, wherein the program is further adapted to print said selected pertinent assessment requirements as a printable form for use in recording evaluations regarding the selected pertinent assessment requirements.
34. The system as recited in claim 29, wherein the program is further adapted to export the evaluation inputs to an assessment compilation database.
35. A method for assessing business parties, comprising: inputting assessment requirements for a variety of assessment situations into a database; selecting at least one assessment characteristic; based upon the assessment characteristic, automatically selecting pertinent requirements from the database; displaying the pertinent requirements; observing a business party's performance with respect to the pertinent requirements; and providing evaluations of the business party's performance with respect to the pertinent requirements.
36. The method as recited in claim 35, further comprising: transferring the evaluations to a compilation database.
37. The method as recited in claim 35, further comprising: generating a report based upon the evaluations.
38. The method as recited in claim 37, further comprising: transferring the report to a central repository.
39. The method as recited in claim 37, wherein the evaluations comprise ratings and textual comments.
40. A business performance assessment system, comprising: an assessment computer having a display; an assessment input device; a database of assessment requirements; programmed instructions configured to display assessment requirements selected from said database on said assessment computer display, wherein the programmed instructions are further configured to allow evaluation inputs regarding business performance to be provided to said assessment computer using said input device; and a memory configured to store said evaluation inputs.
41. The system as recited in claim 40, wherein the computer comprises a personal computer.
42. The system as recited in claim 40, wherein the computer comprises a portable data collector.
43. The system as recited in claim 40, wherein the programmed instructions are further configured to automatically calculate a rating based upon evaluation inputs.
44. The system as recited in claim 40, wherein the programmed instructions are further configured to automatically generate an assessment report based upon evaluation inputs.
PCT/US2001/051000 2000-11-10 2001-11-09 Method and system for facilitating assessments WO2002065232A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002251719A AU2002251719A1 (en) 2000-11-10 2001-11-09 Method and system for facilitating assessments

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US70996300A 2000-11-10 2000-11-10
US09/709,963 2000-11-10

Publications (3)

Publication Number Publication Date
WO2002065232A2 true WO2002065232A2 (en) 2002-08-22
WO2002065232A8 WO2002065232A8 (en) 2003-10-30
WO2002065232A3 WO2002065232A3 (en) 2004-03-25

Family

ID=24852037

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/051000 WO2002065232A2 (en) 2000-11-10 2001-11-09 Method and system for facilitating assessments

Country Status (2)

Country Link
AU (1) AU2002251719A1 (en)
WO (1) WO2002065232A2 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5627973A (en) * 1994-03-14 1997-05-06 Moore Business Forms, Inc. Method and apparatus for facilitating evaluation of business opportunities for supplying goods and/or services to potential customers
US6161101A (en) * 1994-12-08 2000-12-12 Tech-Metrics International, Inc. Computer-aided methods and apparatus for assessing an organization process or system
US6301571B1 (en) * 1996-09-13 2001-10-09 Curtis M. Tatsuoka Method for interacting with a test subject with respect to knowledge and functionality

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004055699A1 (en) * 2002-12-17 2004-07-01 Hearts And Minds Crm Pty Ltd A method for analysing human relationships in a business network
US20120297330A1 (en) * 2011-05-17 2012-11-22 Flexigoal Inc. Method and System for Generating Reports
US11157858B2 (en) 2018-11-28 2021-10-26 International Business Machines Corporation Response quality identification

Also Published As

Publication number Publication date
WO2002065232A3 (en) 2004-03-25
AU2002251719A1 (en) 2002-08-28
WO2002065232A8 (en) 2003-10-30

Similar Documents

Publication Publication Date Title
US11435889B2 (en) System and method for building and managing user experience for computer software interfaces
US6647390B2 (en) System and methods for standardizing data for design review comparisons
AU2018255335B2 (en) Artificially intelligent system employing modularized and taxonomy-base classifications to generated and predict compliance-related content
US6853975B1 (en) Method of rating employee performance
US20180075554A1 (en) System and interface for generating real-time regulatory compliance alerts using modularized and taxonomy-based classification of regulatory obligations
US7624341B2 (en) Systems and methods for searching and displaying reports
US20070192724A1 (en) Method and Apparatus for Custom Display of 3-D Information in Reporting
US20020138297A1 (en) Apparatus for and method of analyzing intellectual property information
AU2014318392B2 (en) Systems, methods, and software for manuscript recommendations and submissions
US20030172082A1 (en) Method and system for accessing action item information
US20030229553A1 (en) Automated online underwriting
WO2004081822A1 (en) Data registration/search support device using a keyword
EP1814048A2 (en) Content analytics of unstructured documents
JP4983028B2 (en) Financial control support program and financial control support system
US20110154293A1 (en) System and method to identify product usability
WO2002065232A2 (en) Method and system for facilitating assessments
JP2003022287A (en) Design review supporting device and method and program and computer readable recording medium with design review supporting program recorded thereon
US7440934B2 (en) Method and system for decomposing and categorizing organizational information
US20070244840A1 (en) System and Method for Enabling Seekers to Create and Post Challengers for Solvers
US20030195754A1 (en) Product deviation request tracking system
US20090187438A1 (en) Method for review appraisals
US20230020047A1 (en) Systems and methods for managing a database storing clauses
EP1207483A2 (en) System and process to electronically categorize and access resource information
JP2022078483A (en) Daily report management device and computer program
Xiao et al. Development of an online supplier selection module

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

CFP Corrected version of a pamphlet front page
CR1 Correction of entry in section i

Free format text: IN PCT GAZETTE 34/2002 DUE TO A TECHNICAL PROBLEM AT THE TIME OF INTERNATIONAL PUBLICATION, SOME INFORMATION WAS MISSING (81). THE MISSING INFORMATION NOW APPEARS IN THE CORRECTED VERSION.

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP