WO2008054403A2 - Systems and methods for identifying, categorizing, quantifying and evaluating risks - Google Patents

Systems and methods for identifying, categorizing, quantifying and evaluating risks

Info

Publication number
WO2008054403A2
Authority
WO
WIPO (PCT)
Prior art keywords
asset
level
risk
sig
matrix
Prior art date
Application number
PCT/US2006/044228
Other languages
French (fr)
Other versions
WO2008054403A9 (en)
WO2008054403A3 (en)
Inventor
Paul J. Frankel
Richard E. De Francesco
Robert K. Gardner
Original Assignee
Probity Laboratories, Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Probity Laboratories, Llc filed Critical Probity Laboratories, Llc
Publication of WO2008054403A2 publication Critical patent/WO2008054403A2/en
Publication of WO2008054403A9 publication Critical patent/WO2008054403A9/en
Publication of WO2008054403A3 publication Critical patent/WO2008054403A3/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08Insurance

Definitions

  • the present invention relates to risk analysis and management, and more particularly to systems and methods for identifying, quantifying and evaluating risks in various contexts and for various purposes.
  • an asset can be analyzed into its various levels of sub-assets in a top-down manner.
  • lowest level sub-assets can be analyzed into components and elements of such components.
  • comprehensive and orthogonal threat probability and vulnerability data can be input for each of the elements of each component of each lowest level sub-asset.
  • such data can be input in the form of a threat probability matrix and a vulnerability matrix.
  • the input data can then be processed to generate an output set for each such sub-asset comprising a combined threat/vulnerability matrix, an index of overall risk vulnerability, or "Figure of Merit" (FOM) and associated retained risk.
  • FOM Figure of Merit
  • For each component and level of sub-assets such an output set can then be processed into combined output sets for the higher-level assets of which they are a part, proceeding back up the asset analysis tree.
  • This can provide an accurate risk calculus for the top-level asset and each level of sub-asset identified in the top-down analysis.
  • outputs can be displayed in various display modes, and an optional iterative risk remediation process can also be performed.
  • a risk calculus can be used to augment, maximize or exploit an adversary's vulnerabilities.
  • Fig. 1 depicts a high-level exemplary process flow according to exemplary embodiments of the present invention
  • Fig. 2 depicts a detailed process flow for asset analysis according to an exemplary embodiment of the present invention
  • Fig. 3 depicts a detailed process flow for computation of Special Interest Group ("SIG") Values and FOM at an individual SIG level according to an exemplary embodiment of the present invention
  • Fig. 4 depicts a detailed process flow for computation of Enterprise Level Gradient Matrix
  • Fig. 5 depicts an exemplary process flow for iterative risk remediation according to an exemplary embodiment of the present invention
  • Fig. 6 illustrates an exemplary asset analysis for an exemplary transportation sector according to an exemplary embodiment of the present invention
  • Fig. 7 illustrates an exemplary asset analysis for an exemplary pharmaceutical sector according to an exemplary embodiment of the present invention
  • Fig. 8 illustrates an exemplary asset analysis for an exemplary automotive sector according to an exemplary embodiment of the present invention
  • Fig. 9 illustrates an exemplary asset analysis for an exemplary enemy targets scenario according to an alternative exemplary embodiment of the present invention
  • Figs. 10 — 17 illustrate exemplary threat probability matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention
  • Figs. 18 — 25 illustrate exemplary threat vulnerability matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention
  • Figs. 26 - 33 illustrate exemplary weighted value arrays for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention
  • Figs. 34 — 41 illustrate exemplary build coefficient matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention
  • Figs. 42 and 43 illustrate exemplary component-level build coefficient matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention
  • Fig. 44 illustrates an exemplary SIG-Level Build Coefficient Matrix according to an exemplary embodiment of the present invention
  • Fig. 45 illustrates an exemplary SIG Matrix according to an exemplary embodiment of the present invention
  • Fig. 46 is a graphic presentation of the exemplary SIG Matrix of Fig. 45 and the FOM derived therefrom;
  • Fig. 47 is the SIG Matrix of Fig. 45 as used in an exemplary enterprise level calculation according to an exemplary embodiment of the present invention
  • Fig. 48 depicts an exemplary SIG Matrix for an Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention
  • Fig. 49 is a graphic presentation of the exemplary SIG Matrix of Fig. 48 and the FOM derived therefrom;
  • Fig. 50 depicts an exemplary Real Value Factors Matrix for an Amtrak 30th Street Station SIG according to an exemplary embodiment of the present invention
  • Fig. 51 depicts an exemplary Proportional Real Value Factors Matrix for the Amtrak 30th Street Station SIG according to an exemplary embodiment of the present invention
  • Fig. 52 depicts an exemplary Real Value Factors Matrix for the Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention
  • Fig. 53 depicts an exemplary Proportional Real Value Factors Matrix for the Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention
  • Fig. 54 depicts an exemplary Consolidated Real Value Factors Matrix for the Amtrak N.E. Corridor enterprise according to an exemplary embodiment of the present invention
  • Fig. 55 depicts an exemplary FOM Build Coefficients Matrix for the Amtrak 30th Street Station SIG according to an exemplary embodiment of the present invention
  • Fig. 56 depicts an exemplary FOM Build Coefficients Matrix for the Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention
  • Fig. 57 depicts an exemplary FOM Consolidated Build Coefficients Matrix for the Amtrak N.E. Corridor enterprise according to an exemplary embodiment of the present invention
  • Fig. 58 depicts an exemplary Enterprise Level Gradient Matrix for the Amtrak N.E. Corridor enterprise according to an exemplary embodiment of the present invention
  • Fig. 59 is a graphic presentation of the exemplary enterprise matrix of Fig. 58 and the FOM derived therefrom;
  • Fig. 60 depicts overall process flow for an exemplary Amtrak Level 3 Asset comprising an exemplary Amtrak N.E. Corridor enterprise, from data entry for the respective components of the exemplary Amtrak N.E. Corridor enterprise through outputs at the Amtrak Level 3 Asset;
  • Figs. 61-63 depict in greater detail the overall process flow of Fig. 60;
  • Fig. 64 illustrates an iterative risk remediation process according to an exemplary embodiment of the present invention.
  • Fig. 65 illustrates an exemplary 2-D display of a SIG matrix according to an exemplary embodiment of the present invention.
  • a user can, for example, enter threat and vulnerability data for an asset into a 12x6 matrix.
  • a 12x6 matrix in exemplary embodiments of the present invention, is the primary data structure by which threat and vulnerability data can be entered, stored and thereafter processed to quantify risk values.
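  • As an illustrative, non-authoritative sketch, the 12x6 grid described above can be represented as a simple list-of-lists keyed by threat type and vulnerability category; the category counts follow the text, but specific labels are placeholders since the full taxonomy tables are only partially reproduced here.
```python
# Minimal sketch (assumed representation) of the 12x6 threat/vulnerability grid.
# Only the dimensions and value conventions are taken from the text above.

N_THREATS = 12   # six cyber + six physical threat categories
N_VULNS = 6      # six orthogonal vulnerability categories

def new_grid(fill=0.0):
    """Return a 12x6 matrix (list of rows), one row per threat type."""
    return [[fill] * N_VULNS for _ in range(N_THREATS)]

# A threat probability matrix holds values from 0 to 1 ...
threat_probability = new_grid(1.0)   # e.g. 100% probability everywhere
# ... while a vulnerability matrix holds values 1..15, read as negative
# exponents of 10 (a 6 means roughly a 1-in-10^6 likelihood).
vulnerability = new_grid(6)
```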
  • FIG. 1 depicts an overall (high-level) process flow according to an exemplary embodiment of the present invention.
  • Figs. 2 through 5 then illustrate in greater detail the functionalities depicted in Fig. 1. With reference to these figures, an exemplary embodiment of the present invention will be next described.
  • Exemplary embodiments of the present invention can be, for example, implemented in a software program or set of interrelated programs using known techniques. Such programs can be provided in software, in hardware or in any combination thereof.
  • an input and user interface module which allows a user to import data files or to input data in real time, as well as to input display and processing parameters and generally exercise control.
  • Such a module could, for example, prompt a user for inputs and provide online help to assist him with data input and control of processing.
  • Such a module could also, for example, allow a user to choose output parameters for display and printing functionalities.
  • an output module which can, for example, generate displays in various formats, send output results to a printer or other peripheral device, and/or export files and data to one or more destinations.
  • a top-down asset analysis can first be performed.
  • a top-down asset analysis is depicted, for example, in Figs. 6-9.
  • Fig. 6, for example is directed to an exemplary economic sector entitled "Transportation.”
  • such a sector can be considered as a top level or Level 1 asset.
  • Proceeding in a top-down approach, such a transportation sector can be divided, for example, into Level 2 assets such as, for example, Waterway, Freight, Rail, Metro, Air, and any others that may be relevant.
  • such a Rail sub-sector (a Level 2 asset) can be further subdivided into, for example, Amtrak railways and CSX Railways.
  • each of Amtrak and CSX (which are Level 3 assets) can be further divided into Level 4 assets.
  • Amtrak can be divided into a N.E. Corridor line and a CA Coastal line
  • CSX Railways can be divided into Ch-Norfolk and All Others.
  • the N.E. Corridor line Level 4 asset can be further divided into two component stations, the 30th Street Station and the Trenton Station.
  • threat probability and vulnerability data presented, for example, in the 12x6 threat probability and vulnerability matrix structures described below, can be entered for the various components of each of the 30th Street Station and Trenton Station.
  • a small subset of the components of the 30th Street Station can be identified as Ticketing and Structure.
  • the component Ticketing includes all aspects of the Ticketing system used at the 30th Street Station, most of which is primarily computer based and involves electronic communications.
  • a Structure component of the 30th Street Station can also be identified. Structure includes tangible objects, such as buildings, people, tracks, platforms, etc., that can be found in the physical world of 30th Street Station. A similar division can be made for the Trenton Station.
  • the 30th Street Station and the Trenton Station are termed Special Interest Groups, or SIGs.
  • a SIG is the smallest division of a top-level asset which has a meaningful asset value for analysis.
  • a SIG as described above, can be considered to have components, such as Ticketing and Structure, and each component can in turn have a number of elements.
  • the cyber component Ticketing can have elements such as Data, Network, Communications, Computers, Software, and others.
  • the physical component of the 30th Street Station identified as Structure can, for example, have as elements People and Platforms.
  • SIGs can be divided into components and those components further divided into their constitutive elements as may be best appropriate in a given context.
  • the asset level of which SIGs are sub-divisions (e.g., the N.E. Corridor entity) is referred to below as the enterprise level.
  • a threat probability and threat vulnerability matrix can be defined for each of the elements of each SIG.
  • Figs. 10 through 17 are exemplary threat probability matrices for each of the elements of the SIG components Ticketing and Structure.
  • Figs. 18 through 25 contain exemplary raw vulnerability data for each of the five elements in Ticketing as well as each of the three elements in Structure.
  • for each SIG, the raw threat and vulnerability data for each of the elements in each of the components can be processed.
  • the output of such processing is an overall vulnerability matrix at the SIG level, sometimes referred to as a "SIG Matrix," an associated Figure of Merit ("FOM"), and a calculation of Net Risk.
  • a Figure of Merit is a measure of risk, and is used in exemplary embodiments of the present invention to quantify the risk, or more precisely, the fraction of the asset value that is vulnerable.
  • the Net Risk of an asset such as, for example, a SIG, can be calculated. The calculation and significance of an FOM and Net Risk are explained in more detail below, with reference to Figs. 3 and 4.
  • Fig. 46 depicts an exemplary 3-D display of the 30th Street Station SIG Matrix according to an exemplary embodiment of the present invention.
  • other displays can be used, such as, for example, 2-D contour maps, bar graphs, or other methods of displaying functions of two variables as are known.
  • a 2-D display can also be formed by collapsing the vertical axis and "looking" from the top down at right angles to the plane of the data points that build the 3-D display. An example of this is shown in Figure 65, which is a 2-D projection of the data presented in 3-D in Fig. 49.
  • N.E. Corridor is thus an asset of one level higher than each of the 30th Street Station and Trenton SIGs.
  • an asset immediately above the SIG level can be referred to as an "enterprise" asset.
  • a threat/vulnerability matrix can also be defined at the enterprise level.
  • Such a matrix can be known, for example, as a "Gradient matrix", inasmuch as its output, a function of two variables, can mathematically be considered as a mapping from an input surface to an output surface, or as a 3D surface.
  • a Gradient matrix is thus not an independent entity, but rather, a function of all of the threat and vulnerability data for each of the elements in each of the SIGs which comprise the enterprise.
  • the Gradient matrix for N.E. Corridor can be found by combining, in a manner to be described more fully below, the SIG level threat and vulnerability matrices for each of the 30th Street Station and Trenton Station SIGs (which comprise the N.E. Corridor).
  • an enterprise level Gradient matrix can be created, and an associated enterprise level compounded FOM can be calculated.
  • a Net Risk Value can be calculated for each SIG.
  • the objective of a risk analysis is to minimize net risk
  • the objective can be to maximize it, as is the case with an opponent's or enemy's asset such as depicted in Fig. 9.
  • these Net Risk values can further be summed to generate a total Net Risk Value for each enterprise asset.
  • Net Risk Value can be calculated for each asset up the asset analysis tree. Net Risk Value is a measure of retained risk in an asset.
  • Net Risk Values can be calculated for each of the 30th Street Station and Trenton SIGs; their sum is the Net Risk Value shown for the N.E. Corridor enterprise.
  • the Net Risk Value for a Level N asset can be calculated by summing the Net Risk Values for the Level N + 1 assets which comprise it. Net Risk Value is a measure of retained risk in an asset.
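  • A minimal sketch of this roll-up is given below; the tree structure and field names are illustrative only, and the SIG-level Net Risk figures are taken from the worked example later in this text.
```python
def rolled_up_net_risk(asset):
    """Net Risk of a Level N asset = sum of the Net Risk Values of the
    Level N+1 assets (or SIGs) which comprise it."""
    if "children" not in asset:            # a SIG: Net Risk computed directly
        return asset["net_risk"]
    return sum(rolled_up_net_risk(c) for c in asset["children"])

ne_corridor = {"children": [
    {"name": "30th Street Station", "net_risk": 1_860_631_568},
    {"name": "Trenton Station", "net_risk": 5_674_000},
]}
print(rolled_up_net_risk(ne_corridor))     # 1,866,305,568
```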
  • a risk remediation analysis can optionally be performed until a diminishing returns point is reached. This analysis shall be described more fully below and is depicted for the 30th Street Station SIG in Fig. 64.
  • Fig. 2 is a process flow diagram for the Top-Down Asset Analysis referred to at 110 in Fig. 1.
  • Figs. 6 through 9 depict examples of the output of such a top-down asset analysis for different asset sectors.
  • the exemplary asset analysis depicted in Fig. 6. will be referred to throughout the following descriptions of process flow according to exemplary embodiments of the present invention.
  • Fig. 2 depicts an exemplary process flow for resolving an asset into its constituent parts.
  • a number L of sub-asset levels in a top-level asset to be analyzed can be identified.
  • the top-level asset is thus "Transportation.” As shown, it can be divided into three sub-sectors. The first such sub-sector is “Rail.” Rail itself can be divided into two “Level 3" assets, namely "Amtrak,” and "CSX.” Each sub-sector can be termed a "Level 2" asset.
  • Amtrak can be divided into a number of "Level 4" assets, namely "N.E. Corridor,” “Others”, and “CA Coastal.”
  • CSX, the other Level 3 asset constituent of Level 2 asset "Rail," can be divided into the Level 4 assets "Ch-Norfolk" and "All Others."
  • the Level 4 assets are enterprise level assets.
  • an enterprise level asset is an asset at the L-th level, which is subdivided into SIGs.
  • a SIG is the smallest division of a top level asset which has a meaningful asset value for analysis. Focusing on the N.E. Corridor enterprise, at 210 N.E. Corridor can be divided into SIGs.
  • the SIGs that comprise the N.E. Corridor can be, for example, the 30th Street Station and Trenton Station.
  • each of the assets can be assigned an asset value known as the "At-Risk Valuation.”
  • Net Risk Value can be obtained via the calculations described below, the process flow of which is illustrated in Figs. 3 through 5.
  • asset values and asset weight factors can be assigned at each asset level down to the SIG level, with asset values in dollars assigned at each of those levels.
  • an Asset Weight Factor is the asset value of a particular asset, sub-asset or SIG, divided by the sum of the asset values of all of the assets, sub-assets or SIGs at that asset level. For example, at the SIG level in Fig. 6 the two SIGs shown are the 30th Street Station and Trenton Station. The 30th Street Station SIG has an asset value of $2.5 billion and the Trenton Station SIG has an asset value of $500 million.
  • each SIG can be divided into components and weighting factors for each component can be assigned. This is illustrated, for example, in Figs. 18 through 25, for the 30th Street Station SIG. In this example, this SIG can be divided into two components, "Ticketing" and "Structure," and each component can be further divided into elements. These weighting factors take into account the fact that in many contexts the contribution of components within SIGs, or elements within components, to the overall risk is not equal.
  • Ticketing is assigned a weighting of 0.8 and Structure is assigned a weighting of 1.0.
  • each of the elements comprising Ticketing and Structure can also respectively each be assigned a weighting factor.
  • multiple sets of asset values and weighting factors can be stored in an exemplary system, and a user can run various risk analyses for the same input data, allowing such a user to obtain a multi-dimensional view of the overall risk situation.
  • the conceptual plane for threats can be divided into twelve categories. These categories comprise six cyber categories and six physical categories.
  • the twelve categories are believed to comprehensively describe the various possible types of threats which any asset, or subdivision thereof, faces. Moreover, these categories describe such threats in an orthogonal or independent way where no category depends upon, or is significantly correlated with, any other category.
  • By using such an exemplary threat taxonomy, a risk analyst is guided to focus on the various separate threats which any asset faces.
  • the following table contains such an exemplary threat taxonomy.
  • Cyberhacker Classical: non-malicious hacker actions; mischief; professional, serious, occasional, amateur, lucky, etc.; operates remotely on a network, does not require proximity; intercepts un-encoded transmission for recording; causes copy to be issued or sent electronically, reverse engineering
  • PHT Physical Cracker/(Hacker) Lock picker, safe cracker, penetrator, impostor: just to prove it (PHT) can be done or for collateral reason; Peeping tom, voyeur, paparazzi, browsing real physical asset elements in containers, dumpster diving, exploiting "loose lips", curiosity or mischief; pilfering, reverse engineering
  • Cyber Terrorist Destruction, subversion, denial of service with malicious intent against cyber elements of an asset; hired agent, angry or motivated current or former employee; directed EMP, co- opted/pressured employee; intercepts un-encoded transmission for recording, manipulation or pilfering Physical Terrorist (PTT) Destruction, vandal, penetration, subversion, sabotage against physical elements of an asset; hired agent, angry or motivated current or former employee; co-opted/pressured employee; exploiting "loose lips"
  • CST Physical Spy (CST) Classical; professional, serious, or amateur, etc. engaged in surveillance ("loose lips”), reconnaissance, trespassing, planting devices, dumpster diving, reverse engineering, tampering/subversion of security protections; hired agent, angry or motivated current or former employee; co-opted/pressured employee; reverse engineering
  • Cyber criminal Theft of information, uploads or damaging code, intentional and malicious cyber activities, hired agent, angry or motivated current or former employee; co-opted/pressured employee, dollar or commodity driver, expert skills available
  • PCT Physical Criminal
  • Cyber Environmental Electrostatic shock to cyber electronics, collateral EMP, ingress of RF into cyber based processes like SCADA or sensor monitoring (as from radio/TV stations, microwave links, medical equipment, cordless phones, cell phone and WiFi PDA), equipment failure
  • a user can, for example, be prompted at a threat grid entry screen to enter the probability of each of the twelve identified types of threats as to each of six identified independent types of vulnerabilities.
  • Such vulnerabilities are also designed to be orthogonal as well as comprehensive.
  • vulnerabilities can be viewed from independent or orthogonal perspectives, in various vulnerability categories.
  • Each such category is intended to be an intrinsic property of an asset, which is independent of its other properties. Avoiding overlap makes a risk analysis more clear and the remedies more evident as to purpose.
  • the following vulnerability categories have been adapted for a much broader range of threat and vulnerability possibilities that can put an asset at risk. If, given changing technologies, economies and political systems, at some point new threat attributes arise that are truly independent from the existing set, new vulnerability value panels can be created. Thus, the taxonomy is scalable.
  • vulnerability classes are seen as intrinsic properties of an asset itself. Exemplary specific definitions that can be used for each vulnerability category are as follows:
  • the above provided threat and vulnerability definition tables provide an analyst or other user with a perspective that can be used for entering data in each cell in respective Threat and Vulnerability grids.
  • Threat values can be normally provided by outside sources.
  • Vulnerability values can either be provided by analysts using templates provided by an exemplary application according to the present invention or from external sources. Such analysts can, for example, utilize independent measures, such as, for example, actuarial and handbook failure data. Such analysts can, for example, glean vulnerabilities for cyber as well as physical assets from industry sources.
  • a given software program can, for example, allow a user to immediately access the definition and some common examples of the various types of threats and vulnerabilities by, for example, clicking on the main threat or vulnerability.
  • a given system could allow a user to press a button or right-click on a mouse within a particular cell in a threat vulnerability matrix and thereby bring up the definitions of the relevant threat and the relevant intersecting vulnerability as well as seek common examples of that type of threat and that type of vulnerability so as to be better able to enter threat and vulnerability data.
  • each "alternate term set" row depicts a conventional alternate term set for categorizing the vulnerability conceptual plane.
  • the cells with asterisks are typical groupings of conventional terms that either do not cover the full range of vulnerabilities or are not independent of each other.
  • Figs. 10 through 17 depict exemplary threat probability matrices for each of the exemplary twelve threat types and six vulnerability types described above.
  • threat probabilities can have any value from 0 to 1 which correlates to a percentage of likely occurrence between 0% and 100%.
  • the range between 0 and 1 could be expressed using three or more decimal places, in which case the range would not be construed as percentages but rather as a probability scale corresponding to the number of decimal places used.
  • threats could be expressed as having a probability of X in a thousand (using three decimal places), Y in a million (using six decimal places), etc., depending upon the number of decimal places allowed in a given embodiment for the entry of threat probability data.
  • Figs. 10 through 17 depict element level Threat Probability Matrices, where a separate matrix is input for each of the respective eight elements in the two components of the 30th Street Station.
  • the 30th Street Station SIG can be divided into two components, "Ticketing" and "Structure."
  • the component “Ticketing” can be further divided into five elements, namely "Data,” “Network,” “Comms,” “Computers,” and “Software,” and the component “Structure” can be further divided into three elements, namely "Building,” “People,” and “Platforms.”
  • This schema will be used in the following description to illustrate the entry and processing of data in exemplary embodiments of the present invention.
  • IV. SIG LEVEL PROCESSING
  • a user can input threat and vulnerability source data for each element of each SIG component.
  • Figs. 10 through 17 are exemplary Threat Probability Matrices
  • Figs. 18 through 25 are exemplary Vulnerability Matrices for each of the elements comprising the 30th Street Station SIG.
  • the cell values in the matrices of Figs. 18 through 25, as is the case with entries in any vulnerability matrix in exemplary embodiments of the present invention, are on a scale of 1 through 15. These numbers are actually negative exponents.
  • this scale can be chosen, in exemplary embodiments of the present invention, to represent the likelihood or probability of occurrence of a given threat against a particular vulnerability.
  • a cell value in a Vulnerability Matrix corresponds to the odds of occurrence expressed as 1 in 10^(Cell Value). Accordingly, a cell value of 1 in such a matrix means that there is a likelihood of 1 in 10 that that particular vulnerability could occur in the presence of a relevant threat, and a value of 15 is interpreted to mean that the likelihood is 1 in 10^15 that such a vulnerability could occur. Obviously, because these numbers are actually negative exponents, the higher the number the better protection there is for that vulnerability; i.e., the lower the risk that a particular threat/vulnerability combination poses to the asset in question.
  • this data can then, for example, begin to be processed according to the methods of the present invention.
  • the risk probability can be decomposed into two key components: threat probability and vulnerability probability.
  • these probabilities can be combined into a compounded value, for ease of calculation.
  • a Compounded Source Cell Value ("CSCV") can be calculated for each element, as shown in Figs. 18 through 25. This involves combining the data in Figs. 10 through 17 with that of Figs. 18 through 25, respectively, according to the following rule: CSCV = Vulnerability Cell Value + LOG[1 / Threat Probability Cell Value]
  • because the vulnerability values are logs (exponents) and the threat probability values are non-exponent (i.e., values from 0 to 1), it is necessary to form a resultant exponent.
  • a threat probability matrix value can be converted to a logarithm and added to the associated vulnerability matrix cell value. Because the compounded threat-vulnerability cell values are thus actually negative exponents, they get more negative to represent a smaller number. The unsigned value then increases as that cell's value is made less contributing. It is assumed that the threat probabilities can have different values for each type of vulnerability and threat type, i.e., for each cell in the threat probability matrix.
  • the Cell Value for the Compounded Threat- Vulnerability Matrix is the Cell Value of the Vulnerability Matrix, in this case 6, added to LOG [1 /Threat Probability Cell Value].
  • where the Threat Input Cell Value is 1 (i.e., a 100% threat probability), LOG[1/1] = 0 and the compounded cell value equals the vulnerability cell value.
  • the Threat Probability Matrices i.e., Figs. 10 through 17, were defined such that there is 100% probability of occurrence of each of the relevant threats against each of the identified vulnerabilities except in the case of cyber threats against people and physical structures wherein the example shows a nil threat. Because in such a case the formula would become indeterminate the exemplary application has a Boolean logic override for the entry of a "0" which makes the corresponding cell value N/A (not applicable).
  • Compounded Threat- Vulnerability Matrices will be identical to the Threat Vulnerability Matrices as is shown in Figs. 18 through 22.
  • the application uses a Boolean override to set the value to N/ A to avoid reducing the equation at 310 to an indeterminate expression.
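  • A minimal sketch of the compounding rule just described, including the zero-threat override, is given below; the function name is illustrative only.
```python
import math

def compounded_cell(vuln_cell, threat_prob):
    """CSCV = vulnerability cell value + LOG10(1 / threat probability).

    vuln_cell is a negative exponent of 10 (e.g. 6 means ~1 in 10^6);
    threat_prob is a probability from 0 to 1. A zero threat probability
    would make the expression indeterminate, so it is overridden to None
    (the "N/A" described above)."""
    if threat_prob == 0:
        return None                      # Boolean override: not applicable
    return vuln_cell + math.log10(1.0 / threat_prob)

# A 100% threat probability leaves the vulnerability value unchanged:
assert compounded_cell(6, 1.0) == 6.0
# A 1-in-10 threat probability makes the compounded exponent one larger (safer):
assert abs(compounded_cell(6, 0.1) - 7.0) < 1e-9
```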
  • the Compounded Threat- Vulnerability Matrix Data for each element can, in exemplary embodiments of the present invention, be transformed to Weighted Value Matrices at 320.
  • Weighted Value Matrices implicate the importance weighting factors assigned at the elemental and componental Levels as is illustrated in Fig. 2, at 230 and 240, and in Figs. 6-9. Accordingly, with reference to Figs. 18 through 25, as between the two components of 30 th Street Station Ticketing and Structure, Ticketing has a component weighting of 0.8 and Structure has a component weighting of 1.0. Similarly, within the Ticketing component at the elemental level, the following elements of component Ticketing have the following weightings: Data 1.0, Network 0.9, Comms 0.9, Computers 0.8 and Software 0.6. Within component Structure, the following elements have the following weightings: Building 0.8, People 1.0, and Platforms 0.7. In exemplary embodiments of the present invention the weightings are user-assigned relative measures of importance or contribution and thus there is no requirement that they sum to unity or any other fixed number.
  • the weighting factors at each of the elemental and componental levels can be, for example, used to transform the Compounded Threat-Vulnerability Matrix Data at 310 into Weighted Value Matrix Data at 320. This is illustrated at 320 of Fig. 3.
  • each of the Compounded Threat- Vulnerability Matrices such as are shown, for example, in Figs. 18 through 25, can be, for example, processed using the following equation:
  • WCV Weighted Cell Value
  • the Compounded Threat-Vulnerability Matrices of Figs. 18 through 25 can be transformed to the Weighted Value Matrices of Figs. 26 through 33, respectively.
  • the Weighted Value Matrices can, for example, be transformed to create Build Coefficient Matrices.
  • Exemplary Build Coefficient Matrices can be generated by processing the Weighted Value Matrices of Figs. 26 through 33 to yield those shown, for example, in Figs. 34 through 41, respectively.
  • the relationship between a Weighted Value Matrix for a given element of a given component of a given SIG, and the corresponding Build Coefficient Matrix ("BCM”) for that element can be expressed via the equation
  • Build Coefficient Cell Value = 1 - 10^(-WCV), as shown at 330.
  • the reason for this transformation is that Build Coefficients are patterned after reliability factors.
  • if a failure rate is one part in one thousand (.001), then a corresponding reliability factor is (1 minus the failure rate), or 0.999, for example.
  • a Weighted Value Matrix cell value is essentially a negative exponent of 10, illustrating or expressing a vulnerability to a given threat. Therefore, a higher number corresponds to a more negative power of 10 and thus a larger denominator and an overall smaller vulnerability number.
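  • The two transformations just described can be sketched as below. Note that the exact weighting equation is not reproduced in this extract, so a simple multiplicative weighting of the exponent is assumed here purely to illustrate the data flow; the build coefficient formula follows the text.
```python
import math

def weighted_cell(cscv, element_weight, component_weight):
    """Weighted Cell Value from a Compounded Source Cell Value.
    ASSUMPTION: a multiplicative weighting of the exponent; the actual
    weighting equation is not reproduced in this extract."""
    return cscv * element_weight * component_weight

def build_coefficient(wcv):
    """Build Coefficient Cell Value = 1 - 10^(-WCV): a reliability-style
    factor, so a 1-in-1000 vulnerability (exponent 3) maps to 0.999."""
    return 1.0 - 10.0 ** (-wcv)

assert abs(build_coefficient(3) - 0.999) < 1e-12
```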
  • the various Build Coefficient Matrices for each component can be combined to generate a Component Level Build Coefficient Matrix by cellwise multiplication.
  • Figs. 34, 35, 36, 37, and 38 can be combined via cellwise multiplication to generate a component level Build Coefficient Matrix as shown in Fig. 42 for the component "Ticketing" of the 30th Street Station SIG.
  • Figs. 39, 40, and 41 can be combined via cellwise multiplication to generate a component-level Build Coefficient Matrix as shown in Fig. 43 for the component "Structure”.
  • Fig. 43 illustrates an exemplary Compounded Structure Component Build Coefficient Matrix, generated by cellwise multiplication across the matrices shown in Figs. 39, 40, and 41.
  • Fig. 44 illustrates an exemplary SIG Level Build Coefficient Matrix which perpetuates the values for cyber threats of Fig. 42, the Compounded Ticketing Component Build Coefficient Matrix, unchanged.
  • a SIG Level Build Coefficient Matrix can be generated by cellwise multiplication across the various Compounded Component Build Coefficient Matrices. In the illustrative example this entails cellwise multiplication of Figs. 42 and 43. However, as noted above, any cell which has no value is assigned a value of 1.
  • For the cyber threat cells, the values of the SIG Level Build Coefficient Matrix are identical to those of Fig. 42, which are those of the Compounded Ticketing Component Build Coefficient Matrix, because the corresponding empty cells of the Structure component are assigned a value of 1.
  • Fig. 44 is thus the product of cellwise multiplying the matrices of Figs. 42 and 43, and is thus a SIG Level Build Coefficient Matrix incorporating the data from both the Ticketing and Structure components of the 30th Street Station SIG.
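  • A sketch of the cellwise multiplication used to roll element-level Build Coefficient Matrices up to component- and SIG-level matrices, with empty cells treated as 1 as described above (names are illustrative):
```python
def cellwise_product(matrices):
    """Combine Build Coefficient Matrices (lists of rows) by cellwise
    multiplication; cells with no value (None) are treated as 1 so they do
    not affect the product."""
    rows, cols = len(matrices[0]), len(matrices[0][0])
    out = [[1.0] * cols for _ in range(rows)]
    for m in matrices:
        for i in range(rows):
            for j in range(cols):
                out[i][j] *= 1.0 if m[i][j] is None else m[i][j]
    return out

# e.g. a SIG-level matrix from the two component-level matrices:
# sig_bcm = cellwise_product([ticketing_bcm, structure_bcm])
```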
  • a SIG Level Build Coefficient Matrix for example, can be transformed to a SIG Matrix, i.e., a Gradient matrix for the SIG.
  • each cell in the SIG Level Build Coefficient Matrix is transformed using the formula: SIG Matrix Cell Value = -LOG(1 - Build Coefficient Cell Value)
  • This mathematical transformation takes a build coefficient reliability factor type cell value and transforms it back to a number which represents a negative exponent of 10 to express the opposite of vulnerability, or "index of protection," where the lesser the vulnerability, the higher the number.
  • the SIG Matrix value is of the same type as was the original Vulnerability Matrices and the Compounded Threat- Vulnerability Matrices. These numbers, therefore, run on a nominal scale of 1 through 16 (or, where the threat probabilities are all 100%, on a nominal scale of 1 to 15).
  • An exemplary SIG Matrix generated from the SIG Level Build Coefficients Matrix of Fig. 44 is depicted in Fig. 45.
  • Fig. 45 is thus a mathematically created matrix representing the overall vulnerability of the 30th Street Station SIG.
  • a mathematical quantity known as a "Figure of Merit” or "FOM” can be calculated from the SIG Level Build Coefficients Matrix.
  • An FOM measures the total reliability or "invulnerability" of a SIG.
  • a FOM can be generated, for example, by multiplying all of the cells in the SLBC Matrix together, and then converting this product to an exponent (of 10) to indicate what an equivalent hybridized overall SIG Value is.
  • An FOM is somewhat analogous to a GPA score for academic performance but without the effect of high scores averaging out failures or very low scores. The non-linear FOM representation does not allow very low scores or "failures" to be hidden from view or computation.
  • an FOM can be calculated at 360 according to the formula: FOM = -LOG(1 - the product of all of the cells in the SIG Level Build Coefficients Matrix)
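  • A sketch of the SIG-level transformations just described, using the FOM formula stated above and repeated later in this text; function names are illustrative only.
```python
import math
from functools import reduce

def sig_matrix_cell(bc_cell):
    """Invert a Build Coefficient back to an 'index of protection' exponent:
    cell = -LOG10(1 - BC), e.g. BC = 0.999999 maps back to roughly 6."""
    if bc_cell is None or bc_cell >= 1.0:
        return None          # no contributing data for this cell
    return -math.log10(1.0 - bc_cell)

def figure_of_merit(bc_matrix):
    """FOM = -LOG10(1 - product of all Build Coefficient cells)."""
    cells = [c for row in bc_matrix for c in row if c is not None]
    product = reduce(lambda a, b: a * b, cells, 1.0)
    return -math.log10(1.0 - product)
```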
  • Fig. 46 is a color-coded three-dimensional depiction of the SIG Matrix Values of Fig. 45.
  • Fig. 46 also presents the FOM for this SIG, computed as described above.
  • Fig. 47 once again presents the SIG Matrix Values of Fig. 45 and additionally illustrates the Net Risk Figure for the 30th Street Station SIG.
  • to calculate the Net Risk, a quantity 10^(-SIG FOM), called the "Risk Multiplier," can be generated; the Net Risk is then the product of the Risk Multiplier and the value of the SIG.
  • a SIG Net Risk can be calculated. Accordingly, using a SIG FOM of 0.128280, a Risk Multiplier can be generated by 10^(-0.128280), which yields 0.744252, or 74.425%.
  • a Risk Multiplier is a function of an FOM representing the fraction of the valuation of the SIG (or other asset) which is at risk given that FOM.
  • 74.425% of the total Asset Value is at risk given the current threat vulnerability data encoded in the FOM, yielding $1,860,631,568 as the net risk associated with the 30th Street Station. It is noted that this is a very large at-risk value.
  • the 30th Street Station SIG is a prime candidate for an optional risk remediation process, as described below with reference to Fig. 5.
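  • The worked numbers above can be reproduced with a few lines; the $2.5 billion At-Risk Valuation for the 30th Street Station SIG is taken from the asset analysis figures, and small differences in the final dollar figure reflect rounding of the FOM.
```python
fom = 0.128280
risk_multiplier = 10.0 ** (-fom)                 # ~0.744252, i.e. ~74.425% at risk
net_risk_value = 2_500_000_000 * risk_multiplier
print(f"Risk Multiplier = {risk_multiplier:.6f}")
print(f"Net Risk Value  = ${net_risk_value:,.0f}")   # roughly $1.86 billion
```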
  • the FOM for the SIG, the SIG Matrix, and the SIG Net Risk can be output "upwards" to be used in calculations at the enterprise level.
  • sub-assets at the L-th asset level are divided into SIGs; thus, as illustrated, the N.E. Corridor, a sub-asset at the L-th (4th) asset level, was divided into the two SIGs of the 30th Street Station and Trenton Station.
  • RVFs Real Value Factors
  • once SIG Matrix values have been converted to real numbers in the form of RVFs, they can be linearly combined to create an Enterprise Level Gradient Matrix.
  • Such combination, in exemplary embodiments of the present invention, can be linear and can, for example, be weighted by the relative value of each SIG (whether in dollar terms, quality of life terms, lives at risk, or any other convenient metric as may be useful) within the enterprise.
  • This is illustrated for the exemplary N.E. Corridor example in Figs. 47 and 48.
  • Fig. 47 shows that the at-risk valuation of the 30th Street Station SIG is $2.5 billion.
  • the quotient defined by the at-risk valuation of a SIG divided by the total at-risk valuation of all SIGs in a given enterprise can, in exemplary embodiments of the present invention, be termed the "Risk Multiplier.” It is the Risk Multiplier that can be used to transform RVFs derived from SIG Matrices into proportional RVFs ("PRVFs") which are weighted RVFs.
  • PRVFs proportional RVFs
  • the Risk Multiplier can then be multiplied by each element in a RVF Matrix to generate a proportional RVF ("PRVF") Matrix for each SIG as shown at 420 using the formula:
  • Fig. 50 depicts an RVF matrix for the 30th Street Station SIG
  • Fig. 51 depicts its corresponding PRVF matrix.
  • Fig. 51 was generated by multiplying each element of Fig. 50 by a RM of 83.33% (see Fig. 6)
  • Fig. 52 depicts an exemplary RVF matrix for the Trenton Station SIG
  • Fig. 53 depicts its corresponding PRVF matrix, generated by multiplying the RVF matrix of Fig. 52 by an RM of 16.67%.
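  • A sketch of the RVF-to-PRVF step follows. The exact SIG-Matrix-to-RVF conversion is not reproduced in this extract, so 10^(-cell value) is assumed here so that the factors can be combined linearly; function names are illustrative.
```python
def real_value_factor(sig_cell):
    """ASSUMED conversion of a SIG Matrix exponent to a real number."""
    return 10.0 ** (-sig_cell)

def proportional_rvf(rvf_matrix, risk_multiplier):
    """PRVF = Risk Multiplier x RVF, where the Risk Multiplier here is the
    SIG's share of the enterprise At-Risk Valuation (e.g. 83.33% for the
    30th Street Station, 16.67% for the Trenton Station)."""
    return [[risk_multiplier * c for c in row] for row in rvf_matrix]
```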
  • various proportional RVF Matrices can be combined by cellwise addition to generate a consolidated RVF Matrix, or CRVF, for the enterprise. This can be done, for example, using the following formula: CRVF Cell Value = sum of the corresponding PRVF Cell Values across all SIGs in the enterprise
  • A CRVF Matrix for the exemplary N.E. Corridor enterprise is depicted in Fig. 54, and an associated ELGM is depicted in Fig. 58.
  • Fig. 59 is a multi-colored 3D graphic display of the ELGM of Fig. 58.
  • the 3D surface of Fig. 59 can be seen as a complex weighted combination of the 3D surfaces of Figs. 46 and 49. Due to the more pronounced contribution of the 30th Street Station SIG relative to that of the Trenton Station SIG, the surface depicted in Fig. 59 appears more similar to the 3-D display of the 30th Street Station SIG Matrix depicted in Fig. 46.
  • Gradient matrices at the enterprise or any higher level can also be displayed as a 2D contour map or by using any other technique to display functions of two variables as is known.
  • an Enterprise Level Build Coefficient Matrix can be generated, using the formula:
  • ELBCM_ij = 1 - 10^(-ELGM_ij)
  • Fig. 57 depicts such an ELBCM for the exemplary N.E. Corridor enterprise.
  • As is described above in connection with SIG level processing, a build coefficient matrix can be used to generate an FOM.
  • an enterprise level compounded FOM ("ELCFOM") can be calculated from an ELBCM using the formula: ELCFOM = -LOG(1 - the product of all of the cells in the ELBCM)
  • Fig. 59 depicts an FOM for the exemplary N.E. Corridor enterprise and a 3D multicolored graphic display of an associated ELGM. It is noted that the FOM value of 0.173 is dangerously low, which indicates that iterative risk remediation may be a useful option, as described below.
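  • A sketch of the enterprise-level consolidation just described; the ELGM-from-CRVF conversion is assumed to be the inverse of the RVF conversion sketched earlier, and names are illustrative only.
```python
import math
from functools import reduce

def consolidated_rvf(prvf_matrices):
    """CRVF = cellwise sum of the proportional RVF matrices of all SIGs."""
    rows, cols = len(prvf_matrices[0]), len(prvf_matrices[0][0])
    return [[sum(m[i][j] for m in prvf_matrices) for j in range(cols)]
            for i in range(rows)]

def enterprise_gradient(crvf):
    """ELGM cell = -LOG10(CRVF cell) (assumed inverse of the RVF step)."""
    return [[-math.log10(c) for c in row] for row in crvf]

def enterprise_fom(elgm):
    """ELBCM cell = 1 - 10^(-ELGM cell); ELCFOM = -LOG10(1 - product of all
    ELBCM cells), mirroring the SIG-level FOM."""
    bcs = [1.0 - 10.0 ** (-c) for row in elgm for c in row]
    return -math.log10(1.0 - reduce(lambda a, b: a * b, bcs, 1.0))
```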
  • the Net Risk for the enterprise can be calculated by simply summing the individual Net Risk values for each SIG in the enterprise which were output to the enterprise processing at 380, of Fig. 3.
  • Net Risk (N.E. Corridor) = Net Risk (30th Street Station) + Net Risk (Trenton Station)
  • output data can be insulated from input source data so that objective numerical results can be compounded without revealing the risks of individual contributors in non-federated analyses. This can be desirable where, for example, a given asset or sector has one or more lower level sub-assets that are sovereign entities.
  • a user can optionally proceed to an iterative FOM improvement process, at 470, if the FOM is low or if it is obvious that there are certain values in the ELGM that are unacceptably low, relative to a defined value or values.
  • the ELGM value of 0.79 in the cell at the intersection of threat "Physical Terrorist” and vulnerability “Discernable” is unreasonably low.
  • the intersection of threat “Physical Environmental” and vulnerability “Accessible” of 0.53 is also unreasonably low.
  • any vulnerability score less than 2 (which correlates to a probability of occurrence greater than or equal to 1 in 100) is unacceptably low in most contexts.
  • a cell value less than 1 correlates to a probability of occurrence greater than 1 in 10, which is almost always unacceptable. If these values can be raised then perhaps the overall FOM can be raised to a more acceptable number, resulting in a lesser portion of the value of the enterprise being at risk.
  • Such an iterative FOM improvement process is illustrated in Fig. 5, and an example of such process illustrated in Fig. 64.
  • the target for remediation is any cell whose value is below one more than an FOM value that would produce acceptable risks.
  • an FOM value of 4 might be an acceptable risk value, so any cell with a value less than 5 could be investigated.
  • the reason for such an increase in the target value for cells is that, because cell values are multiplied together, the product will always be lower than any individual cell value.
  • because an Enterprise Level Gradient Matrix is composed of derived elements, the only way to raise an overall enterprise level FOM is to raise the corresponding vulnerability values at the elemental level. This can be done, for example, by going back to the element level vulnerability matrices contributing to the low vulnerability value and measuring the costs of remediating those risks with low values so as to increase their values, compared with the associated reduction in the proportion of the asset that is at risk as a result of the remediation.
  • In general, as long as the remediation costs are less than the change in Net Risk Value by an acceptable ROI factor, it is worthwhile to pay that remediation cost, recalculate the Net Risk Value and the FOM, and inquire as to whether further remediation, given its cost(s), would continue to decrease the Net Risk Value of that asset by a still greater amount than the associated remediation cost. This process will next be described in detail with reference to Figs. 5 and 64.
  • the vulnerabilities in the element-level matrices that have values below the initial FOM can be selected, net risk values are collected, and remediation costs to bring those cells to a value of at least one higher than the initial FOM can be calculated.
  • Gradient calculations at the relevant SIG levels can be rerun and the reduction in retained risk values observed.
  • various cost metrics can be associated with assets in exemplary embodiments of the present invention. Therefore, although dollars have been used for the illustrated exemplary N.E. Corridor enterprise, in alternative exemplary embodiments lives at risk, quality of life impact, or a variety of other cost or valuation metrics associated with an asset can be used.
  • a "strategic value” may not directly correlate with dollar values but may represent the value of such an asset to an enemy in operating militarily.
  • Numerous other exemplary asset valuation metrics can be used as may be appropriate or desirable in various contexts.
  • the return on investment or ROI can be calculated using the following formula:
  • ROI = (Reduction in L-th Level Net Risk Value) / (Σ SIG remediation costs up to the L-th Level)
  • processes 501, 502, and 503 can be repeated until the ROI fails to meet investment/remediation-yield criteria.
  • investment/remediation-yield criteria can be set by a user or by a special risk analyst.
  • Such an analyst could be, for example, either the same analyst that performed an asset analysis process and generated the data input with the threat and vulnerability matrices, or, for example, a different type of asset analyst sometimes known in the art as a "risk governor".
  • Fig. 5 depicts process flow for an exemplary iterative risk remediation process at the enterprise level according to an exemplary embodiment of the present invention. To illustrate such a process with actual numbers would be rather complicated, inasmuch as each element of each component of each SIG would need to be analyzed for ways to improve (i.e., increase the values of) its respective vulnerability matrix entries and the costs of such remediation calculated. Each such iteration could result in a change to the FOMs for each SIG and a resultant recalculation of the FOM for the enterprise as described in connection with Fig. 4.
  • the source of a low enterprise level FOM is a SIG with a large Asset Weight Factor that has a significantly lower FOM than the other SIGs in the enterprise.
  • the 30th Street Station SIG fits just such a profile. It has an Asset Weighting Factor of 72.289%, and it thus contributes significantly to the overall N.E. Corridor FOM, and its FOM is an exceedingly low 0.128.
  • risk remediation at the 30th Street Station level may be sufficient to solve the problem.
  • Such a process is illustrated in Fig. 64. With reference to Fig. 64, there are seven columns labeled 6401, 6403, 6405, 6407, 6409, 6411 and 6413.
  • Each column contains a different type of metric associated with a given set of threat and vulnerability matrix values.
  • the Baseline analysis row is the starting point for the iterative risk remediation process.
  • the FOM listed in the baseline analysis row is the same FOM provided in Figs. 46 and 47 for the 30th Street Station, namely 0.128. As noted above, this figure is very low, and therefore the 30th Street Station SIG is a prime candidate for an iterative risk remediation process.
  • column 6405 provides a Net Risk Multiplier associated with each FOM. As the FOMs in column 6403 increase, the Net Risk Multiplier, which represents the fraction of the asset's value which is at risk, will decrease. Associated with each Net Risk Multiplier, in column 6407, is a Net Risk Value, or actual dollar value at risk. The Net Risk Value is the asset value of the asset, here the 30th Street Station, multiplied by the Net Risk Multiplier. Column 6409 provides the remediation cost in moving from each row to the next row below.
  • An ROI greater than 1 is generally beneficial, and an ROI greater than 100 represents a significant return on the invested risk remediation costs.
  • the values in the remainder of the rows in Fig. 64 can be similarly generated.
  • an iterative risk remediation process can continue until a DRP, or Diminishing Return Point, is arrived at.
  • the Diminishing Return Point is achieved when the investment cost equals, or substantially equals, the prospective reduction in retained risk, or when the ROI is substantially equal to unity.
  • the remediation cost is greater than or equal to the associated decrease in net risk value, it simply does not make economic sense to further remediate the risk. This is seen in the row entitled Next Analysis-5, where the DRP has been reached.
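  • A sketch of the ROI test and Diminishing Return Point check described above is given below; the numbers in the usage line are hypothetical and are not taken from Fig. 64.
```python
def roi(net_risk_before, net_risk_after, remediation_cost):
    """ROI = reduction in Net Risk Value / remediation cost."""
    return (net_risk_before - net_risk_after) / remediation_cost

def past_diminishing_return_point(net_risk_before, net_risk_after,
                                  remediation_cost, min_roi=1.0):
    """The DRP is reached when the ROI falls to (roughly) unity; min_roi is
    a user- or risk-governor-supplied investment/remediation-yield criterion."""
    return roi(net_risk_before, net_risk_after, remediation_cost) <= min_roi

# Hypothetical iteration: a $10M remediation that removes $1.46B of retained risk
print(past_diminishing_return_point(1_860_000_000, 400_000_000, 10_000_000))  # False (ROI = 146)
```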
  • the percentage of the 30th Street Station which is at-risk has been reduced from 74.427% to only .0015%.
  • Figs. 61 through 63 trace the processing of data from input at the component level through calculation of the FOM, Net Risk Value, and Level 3 Asset Gradient Matrix (one level above the enterprise level) which can be graphically displayed as a 3-D multicolored surface (or as any other representation of a function of two variables, as noted above, such as for example a 2D contour map, a 2D "hot spot" map, etc.) for the Level 3 Asset Amtrak.
  • Fig. 60 is an end-to-end depiction of this process, which originates on the far right side of Fig. 60 and terminates on the far left side of Fig. 60.
  • Figs. 61 through 63 each represent approximately one-third of the process flow depicted in Fig. 60, for ease of description and illustration.
  • Fig. 61 depicts a portion of the processing flow from the inputting of data at the SIG component level to the generation of outputs at the SIG level. This is the process flow which corresponds to Fig. 4.
  • Fig. 62 depicts the process flow starting from using as inputs the enterprise level data for each of the N.E. Corridor enterprise and the CA Coastal enterprise, to the generation of Proportional RVF matrices for each of these enterprises, and carrying forward the Net Risk Value for each of these enterprises.
  • the process flow depicted in Fig. 63 begins with the Proportional RVF matrices generated as shown in Fig. 62 and uses this data to generate an Amtrak Level 3 Asset Gradient Matrix as well as an Amtrak FOM.
  • the Level 3 asset "Amtrak” can be divided into three enterprises, namely, the N. E. Corridor, Others, and CA Coastal.
  • the Level 3 asset Amtrak will be considered to be composed of only the N.E. Corridor and CA Coastal enterprises.
  • the N.E. Corridor enterprise is composed of two SIGs: 30th Street Station and Trenton Station.
  • In Fig. 61, beginning at the right side of the figure, data from the two SIGs comprising the N.E. Corridor can be entered and processed.
  • at 6101 the 30th Street Station SIG Matrix (shown in Fig. 47) can be input, and at 6107 the Trenton Station SIG Matrix (shown in Fig. 48) can be input.
  • Each of these SIG Matrices can be processed in parallel.
  • the 30th Street Station RVF matrix can be generated, as shown in Fig. 50.
  • the Trenton Station RVF matrix can be generated, as shown in Fig. 52.
  • these two RVF Matrices can be transformed to Proportional RVF Matrices at 6103 and 6105, respectively, as shown in Figs. 51 and 53, respectively.
  • the proportional RVF matrices 6103 and 6105 can be added cellwise at 6113 to generate a N.E. Corridor Consolidated RVF Matrix, shown in Fig. 54.
  • it is from this N.E. Corridor Consolidated RVF matrix that the N.E. Corridor enterprise level Gradient matrix is generated at 6201.
  • the creation of FOMs will next be described.
  • the 30th Street Station SIG Matrix is input at 6101. This matrix is shown in Fig. 47. This matrix can be transformed via the formula 1 - 10^(-cell value) to generate a 30th Street Station Build Coefficients matrix at 6110, as shown in Fig. 55.
  • a 30th Street Station FOM Intermediary Product can be generated, which is the product of all cells in the 30th Street Station Build Coefficients matrix.
  • a 30th Street Station FOM can be generated at 6121 by the formula
  • FOM = -LOG(1 - FOM Intermediary Product).
  • a 30th Street Station Net Risk Multiplier 6122 can be generated via the operation 10^(-FOM). Taking the product of the Net Risk Multiplier 6122 and the 30th Street Station Valuation Input 6112 generates a Net Risk Value 6123. In the case of the 30th Street Station, this is $1,860,667,446 (see Fig. 6).
  • the Trenton Station SIG matrix can be input, shown at Fig. 48. From this matrix, using the formula 1 - 10^(-cell value),
  • a Trenton Station Build Coefficients matrix can be generated at 6114, shown at Fig. 56. From there, by taking the product of all cells in that matrix, a Trenton Station FOM Intermediary Product can be generated at 6115, which can be transformed to an FOM at 6125 using the formula
  • FOM = -LOG(1 - Trenton Station FOM Intermediary Product).
  • the Trenton Station FOM can then be transformed to a Net Risk Multiplier at 6126 using the formula 10^(-FOM), and the Net Risk Multiplier 6126 multiplied by the Trenton Station Valuation Input 6116 can thus yield an exemplary Trenton Station Net Risk Value 6127 of $5,674,000 (see Fig. 6).
  • Fig. 62 deals primarily with processing at the N.E. Corridor enterprise level.
  • the N.E. Corridor Enterprise Level Gradient Matrix shown in Fig. 58, can be carried over from Fig. 61.
  • this ELGM can be transformed to a N.E. Corridor Consolidated Build Matrix, as shown in Fig. 57.
  • an N.E. Corridor FOM Intermediary Product can be generated, by multiplying all of the cells in the N.E. Corridor Consolidated Build Matrix (Fig. 57) together, and at 6211 a N.E. Corridor FOM can be formed from the FOM Intermediary Product 6203 using the equation
  • FOM = -LOG(1 - N.E. Corridor FOM Intermediary Product).
  • the N.E. Corridor ELGM can be graphically displayed, as shown in Fig. 59, at 6210. Adding together the Net Risk Values from the 30th Street Station (6123, Fig. 61) and the Trenton Station (6127, Fig. 61), at 6212 a Net Risk Value for the N.E. Corridor enterprise can be generated. This can be input to the next level, the Level 3 Asset Amtrak, as described below. Finally, 6230 represents all of the data relative to the CA Coastal Enterprise that was carried through from lower level computations. The processing required to generate that data is not shown, it being assumed that there is enterprise level data for the CA Coastal enterprise which was generated in an analogous manner as shown for the N.E. Corridor enterprise.
  • CA Coastal Asset Weighting Factors can be input to 6224, a CA Coastal ELGM can be input to 6225, and a CA Coastal Net Risk Value can be input to 6331, on Fig. 63.
  • N.E. Corridor Enterprise data can be input to 6225
  • CA Coastal Net Risk Value can be input to 6331, on Fig. 63.
  • the N.E. Corridor Enterprise data can be processed into a N.E. Corridor Real Value Factor Matrix at 6221. This matrix can in turn be transformed to a N.E. Corridor Proportional Real Value Factor Matrix at 6222.
  • the CA Coastal ELGM can be processed into a CA Coastal Real Value Factor Matrix 6224, and that matrix can, in turn, be processed into a CA Coastal Proportional Real Value Factor Matrix at 6223.
  • the two proportional Real Value Factor matrices at 6222 and 6223 can be respectively combined into an Amtrak Consolidated Real Value Factor Matrix at 6301, in Fig. 63.
  • This matrix can, for example, be transformed into an Amtrak Level 3 Asset Gradient Matrix at 6310 using the equation
  • an Amtrak FOM Intermediary Product can be generated at 6312 by multiplying all elements of this matrix together.
  • an Amtrak FOM can be generated from such an Amtrak FOM Intermediary Product 6312 using the equation:
  • Amtrak FOM = -LOG(1 - Amtrak FOM Intermediary Product)
  • Net Risk Values for any net risk above the SIG levels can be calculated by linearly adding the relevant lower level Net Risk Values.
  • Net Risk Values follow a different data flow, or data path, than do the Gradient matrix and FOM calculations.
  • the N.E. Corridor Net Risk Value from 6212 in Fig. 62 can be input to Fig. 63 at 6330 and the CA Coastal Net Risk Value from 6230 in Fig. 62 can be input to Fig. 63 at 6331.
  • These two values can, for example, then be summed to generate an Amtrak Net Risk Value at 6340.
  • an Amtrak Level 3 Asset Gradient Matrix 6310 which can be displayed in the 3-D multicolored graphic 6320
  • an Amtrak FOM 6321 which is a measure of the risk vulnerability of the entire Amtrak system
  • an Amtrak Net Risk Value 6340 (sometimes referred to as retained risk).
  • risk assessment is understood in the context of having one or more assets to protect that have intrinsic vulnerabilities and potential threats.
  • the objective is to lower the vulnerabilities in the face of potential threats to lower overall risk.
  • an asset belongs to an enemy or aggressor.
  • a user knows or strongly suspects its vulnerabilities and has no intention of lowering them. In fact he may seek to maximize them.
  • exemplary embodiments of the present invention can be used to analyze the effectiveness of the threats that may be generated.
  • the data input is all the same, as described above, and the data processing is the same; however, the activity level used to determine the input data to the threat matrices is very high, while the vulnerability matrix activity level is significantly reduced.
  • an "improved" FOM is a smaller FOM, indicating a higher risk.
  • ROI calculations can, for example, follow the same process, but the intention on the input of the data is vastly different.
  • ROI can be measured as the amount of risk AUGMENTATION divided by the costs of implementing a LOWER FOM.
  • Threat values can be as granular as desired since the methods of the present invention are indifferent to granularity.
  • retained risk values can be maximized as opposed to minimized in the more conventional risk management applications as illustrated above.
  • the methods of the present invention could also be applied to analyze the "risk" - in political terms, using some meaningful metric - that an opposing candidate faces in a political campaign comprising multiple candidates.
  • for vulnerabilities, the weaknesses (education, record, finances, hidden information, dirt, cosmetics, speaking ability, knowledge, prior experience, etc.) can be expressed as the likelihood of looking bad, happening, or being revealed in the campaign or relative to the candidate; for threats, the strengths of the opposition (debates, misrepresentations, dirty tricks, cosmetics, outspending, media, etc.) can be expressed as the probability or likelihood of implementation in countering those vulnerabilities.
  • Such an analysis can be done across several boundaries (assets) geographic or demographic, as well as considering the strengths and weaknesses of multiple candidates.
  • the "what if or iterative risk remediation analysis can be used against different strategies to foil the effectiveness of the competition and/or raise the potential voting point separation between candidates (i.e., lower the retained risk).

Abstract

Systems and methods for identifying, categorizing, quantifying and evaluating risks are presented. In the present invention, an Asset can be analyzed into its various levels of sub-assets in a top-down manner. In turn, lowest level sub-assets can be analyzed into components and elements of such components. In exemplary embodiments of the present invention, comprehensive and orthogonal threat probability and vulnerability data can be input for each of the elements of each component of each lowest level sub-asset. In exemplary embodiments of the present invention such data can be input in the form of a threat probability matrix and a vulnerability matrix. The input data can then be processed to generate an output set for each such sub-asset comprising a combined threat/vulnerability matrix, an index of overall risk vulnerability, or "Figure of Merit" (FOM), and associated retained risk.

Description

SYSTEMS AND METHODS FOR IDENTIFYING, CATEGORIZING, QUANTIFYING AND EVALUATING RISKS
TECHNICAL FIELD:
The present invention relates to risk analysis and management, and more particularly to systems and methods for identifying, quantifying and evaluating risks in various contexts and for various purposes.
BACKGROUND OF THE INVENTION:
In the post 9/11 world, i.e., a world where there is an ongoing and global War on Terror, it is important to identify risks with particularity. Whereas in the past people did not need to spend a lot of energy and resources in identifying the full range of possible threats and intrinsic vulnerabilities to those threats of a system, an enterprise, an asset within an enterprise or a component of such an asset, it is now becoming more and more important that businesses, governments, and other institutions be able to do just that. Moreover, such identification of threats and vulnerabilities needs to be done objectively and rigorously.
Not only is it necessary to conceptually identify potential risks in terms of costs of different kinds (e.g., financial, quality of life, severity of breach in security, etc.), it is also incumbent upon managers, governmental leaders, and business executives to find a framework in which to quantify such risks with vastly increased objectivity as well as to quantify the costs (again measured in various ways) of remediating some or all of the most egregious risks.
Conventional approaches concentrate on colloquially and/or vaguely stated vulnerabilities to risk as opposed to encompassing all categories intrinsic to the element at risk. Often the descriptors are not independent of each other; they overlap and also have attributes missing. Likewise, threats that are considered are not both encompassing and independent. To treat risk exhaustively so as to avoid surprise from overlooking the more subtle but exploitable components of risk, completeness is essential. Additionally, conventional approaches use abbreviated and subjective input criteria. Often the values for a given threat are expressed in modest whole number percentages or broad categories of low, medium and high. Invariably, analyses are done by the hour and impressively large reports are prepared. While thoroughness is ostensibly provided, the objectivity of the input data and the ease and timeliness for executive evaluation are minimal.
Moreover, conventional numerics used in evaluating risks tend to be limited to simple blocks of numbers, or colors, or labels such as high, medium or low or variations thereof. One need only recall the United States Department of Homeland Security's use of a color coded terror threat system, which arguably does not convey much, if any, useful information. Rather, a risk evaluation range should be expansive and look to the potential failure of an aspect of the element in question with as much granularity as available. For example, failure can occur in parts per thousand, million or more. To achieve a net-risk valuation of $50,000 on an asset worth $5 billion requires a discrimination granularity of 100,000 to 1. This suggests that simple percentages or worse are not an effective risk management tool. Likewise, risk governance needs a tool to evaluate the cost benefits of remedies, such as, for example, a Return On Investment ("ROI") calculation instead of simply seeking to drive all vulnerabilities to the minimum.
Thus, what is needed in the art is a framework in which threats as well as the various types of vulnerabilities to those threats, all measured in terms of various asset valuation metrics, can be identified comprehensively without overlap, and quantified in terms of such valuation metrics to better manage risk.
SUMMARY OF THE INVENTION:
Systems and methods for identifying, categorizing, quantifying and evaluating risks are presented. In exemplary embodiments of the present invention an asset can be analyzed into its various levels of sub-assets in a top-down manner. In turn, lowest level sub-assets can be analyzed into components and elements of such components. In exemplary embodiments of the present invention, comprehensive and orthogonal threat probability and vulnerability data can be input for each of the elements of each component of each lowest level sub-asset. In exemplary embodiments of the present invention such data can be input in the form of a threat probability matrix and a vulnerability matrix. The input data can then be processed to generate an output set for each such sub-asset comprising a combined threat/vulnerability matrix, an index of overall risk vulnerability, or "Figure of Merit" (FOM) and associated retained risk. For each component and level of sub-assets such an output set can then be processed into combined output sets for the higher-level assets of which they are a part, proceeding back up the asset analysis tree. This can provide an accurate risk calculus for the top-level asset and each level of sub-asset identified in the top-down analysis. In exemplary embodiments of the present invention, such outputs can be displayed in various display modes, and an optional iterative risk remediation process can also be performed. In alternative "inverse" exemplary embodiments of the present invention a risk calculus can be used to augment, maximize or exploit an adversary's vulnerabilities. BRIEF DESCRIPTION OF THE DRAWINGS:
Fig. 1 depicts a high-level exemplary process flow according to exemplary embodiments of the present invention;
Fig. 2 depicts a detailed process flow for asset analysis according to an exemplary embodiment of the present invention;
Fig. 3 depicts a detailed process flow for computation of Special Interest Group ("SIG") Values and FOM at an individual SIG level according to an exemplary embodiment of the present invention;
Fig. 4 depicts a detailed process flow for computation of Enterprise Level Gradient Matrix and
Compounded FOM according to an exemplary embodiment of the present invention;
Fig. 5 depicts an exemplary process flow for iterative risk remediation according to an exemplary embodiment of the present invention;
Fig. 6 illustrates an exemplary asset analysis for an exemplary transportation sector according to an exemplary embodiment of the present invention;
Fig. 7 illustrates an exemplary asset analysis for an exemplary pharmaceutical sector according to an exemplary embodiment of the present invention;
Fig. 8 illustrates an exemplary asset analysis for an exemplary automotive sector according to an exemplary embodiment of the present invention;
Fig. 9 illustrates an exemplary asset analysis for an exemplary enemy targets scenario according to an alternative exemplary embodiment of the present invention;
Figs. 10 - 17 illustrate exemplary threat probability matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention;
Figs. 18 - 25 illustrate exemplary threat vulnerability matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention;
Figs. 26 - 33 illustrate exemplary weighted value arrays for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention;
Figs. 34 — 41 illustrate exemplary build coefficient matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention;
Figs. 42 and 43 illustrate exemplary component-level build coefficient matrices for each defined component of an exemplary SIG according to an exemplary embodiment of the present invention;
Fig. 44 illustrates an exemplary SIG-Level Build Coefficient Matrix according to an exemplary embodiment of the present invention;
Fig. 45 illustrates an exemplary SIG Matrix according to an exemplary embodiment of the present invention;
Fig. 46 is a graphic presentation of the exemplary SIG Matrix of Fig. 45 and the FOM derived therefrom;
Fig. 47 is the SIG Matrix of Fig. 45 as used in an exemplary enterprise level calculation according to an exemplary embodiment of the present invention;
Fig. 48 depicts an exemplary SIG Matrix for an Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention;
Fig. 49 is a graphic presentation of the exemplary SIG Matrix of Fig. 48 and the FOM derived therefrom;
Fig. 50 depicts an exemplary Real Value Factors Matrix for an Amtrak 30th Street Station SIG according to an exemplary embodiment of the present invention;
Fig. 51 depicts an exemplary Proportional Real Value Factors Matrix for the Amtrak 30th Street Station SIG according to an exemplary embodiment of the present invention;
Fig. 52 depicts an exemplary Real Value Factors Matrix for the Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention;
Fig. 53 depicts an exemplary Proportional Real Value Factors Matrix for the Amtrak Trenton Station SIG according to an exemplary embodiment of the present invention;
Fig. 54 depicts an exemplary Consolidated Real Value Factors Matrix for the Amtrak N.E.
Corridor enterprise according to an exemplary embodiment of the present invention;
Fig. 55 depicts an exemplary FOM Build Coefficients Matrix for the Amtrak 30th Street Station
SIG according to an exemplary embodiment of the present invention;
Fig. 56 depicts an exemplary FOM Build Coefficients Matrix for the Amtrak Trenton Station
SIG according to an exemplary embodiment of the present invention;
Fig. 57 depicts an exemplary FOM Consolidated Build Coefficients Matrix for the Amtrak N.E.
Corridor enterprise according to an exemplary embodiment of the present invention;
Fig. 58 depicts an exemplary Enterprise Level Gradient Matrix for the Amtrak N.E. Corridor enterprise according to an exemplary embodiment of the present invention;
Fig. 59 is a graphic presentation of the exemplary enterprise matrix of Fig. 58 and the FOM derived therefrom; and
Fig. 60 depicts overall process flow for an exemplary Amtrak Level 3 Asset comprising an exemplary Amtrak N.E. Corridor enterprise, from data entry for the respective components of the exemplary Amtrak N.E. Corridor enterprise through outputs at the Amtrak Level 3 Asset;
Figs. 61-63 depict in greater detail the overall process flow of Fig. 60;
Fig. 64 illustrates an iterative risk remediation process according to an exemplary embodiment of the present invention; and
Fig. 65 illustrates an exemplary 2-D display of a SIG matrix according to an exemplary embodiment of the present invention.
It is noted that the patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the U.S. Patent Office upon request and payment of the necessary fees.
DETAILED DESCRIPTION OF THE INVENTION:
I. PROCESS FLOW OVERVIEW
Given the unique taxonomy described above, a user can, for example, enter threat and vulnerability data for an asset into a 12x6 matrix. Such an exemplary 12x6 matrix, in exemplary embodiments of the present invention, is the primary data structure by which threat and vulnerability data can be entered, stored and thereafter processed to quantify risk values.
Fig. 1 depicts an overall (high-level) process flow according to an exemplary embodiment of the present invention. Figs. 2 through 5 then illustrate in greater detail the functionalities depicted in Fig. 1. With reference to these figures, an exemplary embodiment of the present invention will be next described.
Exemplary embodiments of the present invention can be, for example, implemented in a software program or set of interrelated programs using known techniques. Such programs can be provided in software, in hardware or in any combination thereof. There could be, for example, an input and user interface module which allows a user to import data files or to input data in real time, as well as to input display and processing parameters and generally exercise control. Such a module could, for example, prompt a user for inputs and provide online help to assist him with data input and control of processing. Such a module could also, for example, allow a user to choose output parameters for display and printing functionalities. There could additionally be one or more processing modules where the input data is processed as described below. Finally, there could be an output module which can, for example, generate displays in various formats, send output results to a printer or other peripheral device, and/or export files and data to one or more destinations.
Overall processing flow will next be described with reference to Figs. 1-5. With reference to Fig. 1 , at 110, a top-down asset analysis can first be performed. A top-down asset analysis is depicted, for example, in Figs. 6-9. Fig. 6, for example, is directed to an exemplary economic sector entitled "Transportation." In exemplary embodiments of the present invention, such a sector can be considered as a top level or Level 1 asset. Proceeding in a top-down approach, such a transportation sector can be divided, for example, into Level 2 assets such as, for example, Waterway, Freight, Rail, Metro, Air, and any others that may be relevant. For illustrative purposes, what will be used in the following description as a continuing illustrative example is an exemplary asset which falls under the Rail sub-sector of the Transportation sector. This example is figurative and constructed only for illustrative purposes. However, it is designed to showcase certain real world attributes of such railway assets and how they can be modeled and analyzed as to risks according to the methods of the present invention.
As seen in Fig. 6, such a Rail sub-sector (a Level 2 asset) can be further subdivided into, for example, Amtrak Railways and CSX Railways. As can be further seen in Fig. 6, each of Amtrak and CSX (which are Level 3 assets) can be further divided into Level 4 assets. Accordingly, Amtrak can be divided into a N.E. Corridor line and a CA Coastal line, and CSX Railways can be divided into Ch-Norfolk and All Others. Finally, for example, the N.E. Corridor line Level 4 asset can be further divided into two component stations, the 30th Street Station and the Trenton Station.
Continuing with reference to Fig. 1, at 120 threat probability and vulnerability data, presented, for example, in the 12x6 threat probability and vulnerability matrix structures described below, can be entered for the various components of each of the 30th Street Station and Trenton Station. For example, a small subset of the components of the 30th Street Station can be identified as Ticketing and Structure. The component Ticketing, as an example of a cyber system, includes all aspects of the Ticketing system used at the 30th Street Station, most of which is primarily computer based and involves electronic communications. Additionally, a Structure component of the 30th Street Station can also be identified. Structure includes tangible objects, such as buildings, people, tracks, platforms, etc., that can be found in the physical world of 30th Street Station. A similar division can be made for the Trenton Station.
In the nomenclature of exemplary embodiments of the present invention, the 30th Street Station and the Trenton Station are termed Special Interest Groups, or SIGs. A SIG is the smallest division of a top-level asset which has a meaningful asset value for analysis. A SIG, as described above, can be considered to have components, such as Ticketing and Structure, and each component can in turn have a number of elements. For example, for the 30th Street Station SIG, the cyber component Ticketing can have elements such as Data, Network, Communications, Computers, Software, and others. Similarly, the physical component of the 30th Street Station identified as Structure can, for example, have as elements People and Platforms. Other SIGs can be divided into components and those components further divided into their constitutive elements as may be best appropriate in a given context. Moreover, in exemplary embodiments of the present invention, the asset level which SIGs are sub-divisions of (e.g., N.E. Corridor) is also termed an "enterprise."
Continuing with reference to Fig. 1, at 120 a threat probability and threat vulnerability matrix can be defined for each of the elements of each SIG. Thus, for the 30th Street Station SIG, Figs. 10 through 17 are exemplary threat probability matrices for each of the elements of the SIG components Ticketing and Structure. Additionally, Figs. 18 through 25 contain exemplary raw vulnerability data for each of the five elements in Ticketing as well as each of the three elements in Structure.
Continuing with reference to Fig. 1, at 130 all of the raw threat and vulnerability data for each of the elements in each of the components of each SIG can be processed. The output of such processing is an overall vulnerability matrix at the SIG level, sometimes referred to as a "SIG Matrix," an associated Figure of Merit ("FOM"), and a calculation of Net Risk. A Figure of Merit is a measure of risk, and is used in exemplary embodiments of the present invention to quantify the risk, or more precisely, the fraction of the asset value that is vulnerable. Using the FOM, the Net Risk of an asset, such as, for example, a SIG, can be calculated. The calculation and significance of an FOM and Net Risk are explained in more detail below, with reference to Figs. 3 and 4.
At 140, once the SIG Matrix values and the FOM are obtained for each SIG, they can be displayed graphically. Fig. 46 depicts an exemplary 3-D display of the 30th Street Station SIG Matrix according to an exemplary embodiment of the present invention. In alternate exemplary embodiments of the invention other displays can be used, such as, for example, 2-D contour maps, bar graphs, or other methods of displaying functions of two variables as are known. A 2-D display can also be formed by collapsing the vertical axis and "looking" from the top down at right angles to the plane of the data points that build the 3-D display. An example of this is shown in Figure 65, which is a 2-D projection of the data presented in 3-D in Fig. 49.
Recalling that SIGs 30th Street Station and Trenton Station are both parts of the Level 4 asset N.E. Corridor, N. E. Corridor is thus an asset of one level higher than each of the 30th Street Station and Trenton SIGs. As noted, such an asset, being immediately above the SIG level, can be referred to as an "enterprise" asset. Using the methods of the present invention, a threat/vulnerability matrix can also be defined at the enterprise level. Such a matrix can be known, for example, as a "Gradient matrix", inasmuch as its output, a function of two variables, can mathematically be considered as a mapping from an input surface to an output surface, or as a 3D surface. A Gradient matrix is thus not an independent entity, but rather, a function of all of the threat and vulnerability data for each of the elements in each of the SIGs which comprise the enterprise. Thus, for example, the Gradient matrix for N.E. Corridor can be found by combining, in a manner to be described more fully below, the SIG level threat and vulnerability matrices for each of the 30th Street Station and Trenton Station SIGs (which comprise the N.E. Corridor). Thus, at 150 an enterprise level Gradient matrix can be created, and an associated enterprise level compounded FOM can be calculated.
Given a Gradient matrix, in exemplary embodiments of the present invention, there is always a FOM that can be associated with it. Therefore, using the FOMs for each of the SIGs as shown in Figs. 6-9, a Net Risk Value can be calculated for each SIG. As described below, while in most cases the objective of a risk analysis is to minimize net risk, in some exemplary embodiments the objective can be to maximize it, as is the case with an opponent's or enemy's asset such as depicted in Fig. 9. At 160 these Net Risk values can further be summed to generate a total Net Risk Value for each enterprise asset. Similarly, Net Risk Value can be calculated for each asset up the asset analysis tree. Net Risk Value is a measure of retained risk in an asset. Thus, with reference to Fig. 6, there can be seen Net Risk Values for each of the 30th Street Station and Trenton SIGs. Their sum is the Net Risk Value shown for the N.E. Corridor enterprise. In turn, as described more fully below, for each Level N asset its Net Risk Value can be calculated by summing the Net Risk Values for the Level N + 1 assets which comprise it.
Finally, at 170, given the Net Risk Value of an enterprise, if the associated FOM is characterized as too low relative to a defined minimum acceptable value (i.e., too large a fraction of the asset value is retained as risk), a risk remediation analysis can optionally be performed until a diminishing returns point is reached. This analysis shall be described more fully below and is depicted for the 30th Street Station SIG in Fig. 60.
II. ASSET ANALYSIS
Fig. 2 is a process flow diagram for the Top-Down Asset Analysis referred to at 110 in Fig. 1. Figs. 6 through 9 depict examples of the output of such a top-down asset analysis for different asset sectors. As noted, for ease of illustration, the exemplary asset analysis depicted in Fig. 6 will be referred to throughout the following descriptions of process flow according to exemplary embodiments of the present invention.
Fig. 2 depicts an exemplary process flow for resolving an asset into its constituent parts. With reference to Fig. 2, at 201, a number L of sub-asset levels in a top-level asset to be analyzed can be identified. Figs. 6-9 depict exemplary outputs of such a process for different sectors, where L=4. With reference to Fig. 6, the top-level asset is thus "Transportation." As shown, it can be divided into three sub-sectors. The first such sub-sector is "Rail." Rail itself can be divided into two "Level 3" assets, namely "Amtrak," and "CSX." Each sub-sector can be termed a "Level 2" asset. Going down one more level, Amtrak can be divided into a number of "Level 4" assets, namely "N.E. Corridor," "Others", and "CA Coastal." Similarly, CSX, the other Level 3 asset constituent of Level 2 asset "Rail," can be divided into Level 4 assets "Ch-Norfolk" and "All Others." Here, the Level 4 assets are enterprise level assets. As noted, an enterprise level asset is an asset at the Lth level, which is subdivided into SIGs. As noted above, a SIG is the smallest division of a top level asset which has a meaningful asset value for analysis. Focusing on the N.E. Corridor enterprise, at 210 N.E. Corridor can be divided into SIGs. The SIGs that comprise the N.E. Corridor can be, for example, 30th Street Station and Trenton Station. As can be seen from Fig. 6, each of the assets can be assigned an asset value known as the "At-Risk Valuation." In exemplary embodiments of the present invention they can also be assigned a Net Risk Value. As noted, a Net Risk Value can be obtained via the calculations described below, the process flow of which is illustrated in Figs. 3 through 5.
Continuing with reference to Fig. 2, at 220 asset values and asset weight factors can be assigned at each asset level down to the SIG level. With reference to Fig. 6, as noted above, there are asset values in dollars assigned at each of those levels. Additionally, an Asset Weight Factor is the asset value of a particular asset, sub-asset or SIG, divided by the total value of all of the assets, sub-assets or SIGs at that asset level. For example, at the SIG level in Fig. 6 the two SIGs shown are the 30th Street Station and Trenton Station. The 30th Street Station SIG has an asset value of $2.5 billion and the Trenton Station SIG has an asset value of $500 million. Accordingly, the Asset Weight Factor of the 30th Street Station is $2.5 billion divided by $3 billion or 83.33%, and the Asset Weight Factor of the Trenton Station is $.5 billion divided by $3 billion or 16.67%. Continuing with reference to Fig. 2, at 230 each SIG can be divided into components and weighting factors for each component can be assigned. This is illustrated, for example, in Figs. 18 through 25, for the 30th Street Station SIG. In this example, this SIG can be divided into two components, "Ticketing" and "Structure" and each component can be further divided into elements. These weighting factors take into account the fact that in many contexts the contribution of components within SIGs, or elements within components, to the overall risk is not equal. In the illustrated example, at the component level, Ticketing is assigned a weighting of 0.8 and Structure is assigned a weighting of 1.0. At the elemental level each of the elements comprising Ticketing and Structure can also respectively each be assigned a weighting factor. These weighting factors capture the relative importance of the risk data associated with each element and component to the whole. Once 240 has been completed, the asset analysis is complete, and data can be entered for each element of each component of each SIG, the process flow for which is illustrated in Fig. 3, next described.
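The Asset Weight Factor arithmetic just described can be sketched as follows (a hypothetical helper, not the patented implementation, using the SIG valuations from the Fig. 6 example):

```python
def asset_weight_factors(valuations):
    """Asset Weight Factor = an asset's value divided by the total value of all
    assets, sub-assets or SIGs at the same level."""
    total = sum(valuations.values())
    return {name: value / total for name, value in valuations.items()}

sig_valuations = {"30th Street Station": 2_500_000_000, "Trenton Station": 500_000_000}
print(asset_weight_factors(sig_valuations))
# {'30th Street Station': 0.8333..., 'Trenton Station': 0.1666...}  (83.33% and 16.67%)
```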
III. DATA ENTRY FORMAT AND SUB-ASSETS
It is noted that different asset analyses, using different asset valuation metrics, will result in different weightings of both elements within SIG components, and SIG components within SIGs. Nonetheless, the threat probability and vulnerability data are the same for a given element across all valuation metrics. Thus, in exemplary embodiments of the present invention, multiple sets of asset values and weighting factors can be stored in an exemplary system, and a user can run various risk analyses for the same input data, allowing such a user to obtain a multi-dimensional view of the overall risk situation. In exemplary embodiments of the present invention, the conceptual plane for threats can be divided into twelve categories. These categories comprise six cyber categories and six physical
categories. The twelve categories are believed to comprehensively describe the various possible types of threats which any asset, or subdivision thereof, faces. Moreover, these categories describe such threats in an orthogonal or independent way where no category depends upon, or is significantly correlated with, any other category. By using such an exemplary threat taxonomy, a risk analyst is guided to focus in on the various separate threats which any asset faces. The following table contains such an exemplary threat taxonomy.
A. Exemplary Threat Taxonomy
In exemplary embodiments of the present invention, the following exemplary specific definitions
can be used for each Threat category:
Threat Examples of Threats
Cyber Hacker (CHT) Classical: non-malicious hacker actions; mischief; professional, serious, occasional, amateur, lucky, etc.; operates remotely on a network, does not require proximity; intercepts un-encoded transmission for recording; causes copy to be issued or sent electronically, reverse engineering
Physical Cracker/(Hacker) (PHT) Lock picker, safe cracker, penetrator, impostor: just to prove it can be done or for collateral reason, Peeping Tom, voyeur, paparazzi, browsing real physical asset elements in containers, dumpster diving, exploiting "loose lips", curiosity or mischief; pilfering, reverse engineering
Cyber Terrorist (CTT) Destruction, subversion, denial of service with malicious intent against cyber elements of an asset; hired agent, angry or motivated current or former employee; directed EMP, co-opted/pressured employee; intercepts un-encoded transmission for recording, manipulation or pilfering
Physical Terrorist (PTT) Destruction, vandal, penetration, subversion, sabotage against physical elements of an asset; hired agent, angry or motivated current or former employee; co-opted/pressured employee; exploiting "loose lips"
Cyber Spy (CST) Impersonator, sniffing to determine activity, browsing to view content and activities, shadowing/copying to duplicate functions, insertion of rogue data; hired agent, angry or motivated current or former employee; co-opted/pressured employee; reverse engineering
Physical Spy (PST) Classical; professional, serious, or amateur, etc. engaged in surveillance ("loose lips"), reconnaissance, trespassing, planting devices, dumpster diving, reverse engineering, tampering/subversion of security protections; hired agent, angry or motivated current or former employee; co-opted/pressured employee; reverse engineering
Cyber Criminal (CCT) Theft of information, uploads of damaging code, intentional and malicious cyber activities, hired agent, angry or motivated current or former employee; co-opted/pressured employee, dollar or commodity driven, expert skills available
Physical Criminal (PCT) Classical; professional, serious, amateur, etc. engaged in theft, destruction, tampering, denial of service, vandalism, penetration, subversion with malicious intent against non-cyber elements of an asset; Contract or for-hire activities, dollar or commodity driven, expert skills available; Substitution of asset elements, angry or motivated current or former employee, co-opted/pressured employee; Bribes, blackmail humans, organized crime resources; engaged in surveillance ("loose lips"), trespassing, planting devices, tampering/subversion of security protections; dumpster diving
Cyber Accidental/Inadvertent Action (CAIT) Host cyber equipment failure, inadequate sizing of processor speeds, memory, HDD, network, or peripherals, microwave oven interference with WiFi; Operator error, side effect of another action, e.g., causing transmission security encryption to fail
Physical Accidental/Inadvertent Action (PAIT) Operator liquid spills, fire, extinguisher fluids, cable cuts causing malfunctions; dry atmosphere causing static discharges from people into equipment (not cyber gear), failure of UPS supplies, suboptimal use of available resources (buying another fire extinguisher instead of something more effective), people-caused or byproducts of another incident resulting in falling objects or interaction/interference
Cyber Environmental (CET) Electrostatic shock to cyber electronics, collateral EMP, ingress of RF into cyber based processes like SCADA or sensor monitoring (as from radio/TV stations, microwave links, medical equipment, cordless phones, cell phone and WiFi PDA), equipment failure
Physical Environmental (PET) Force Majeure, weather, non-cyber equipment failure, structure failure, power outage, sun spot activity, asteroid/space junk strikes, etc.
In exemplary embodiments of the present invention, a user can, for example, be prompted at a threat grid entry screen to enter the probability of each of the twelve identified types of threats as to each of six identified independent types of vulnerabilities. Such vulnerabilities are also designed to be orthogonal as well as comprehensive.
B. Exemplary Vulnerability Taxonomy
As noted above, in exemplary embodiments of the present invention vulnerabilities can be viewed from independent or orthogonal perspectives, in various vulnerability categories. Each such category is intended to be an intrinsic property of an asset, which is independent of its other properties. Avoiding overlap makes a risk analysis more clear and the remedies more evident as to purpose. Inspired by attributes developed by the NSA for cyber analysis, the following vulnerability categories have been adapted for a much broader range of threat and vulnerability possibilities that can put an asset at risk. If, given changing technologies, economies and political systems, at some point new threat attributes arise that are truly independent from the existing set, new vulnerability value panels can be created. Thus, the taxonomy is scalable. As noted, in exemplary embodiments of the present invention vulnerability classes are seen as intrinsic properties of an asset itself. Exemplary specific definitions that can be used for each vulnerability category are as follows:
(The vulnerability category definition tables are presented as images in the original document and are not reproduced here.)
Thus, in exemplary embodiments of the present invention, the above provided threat and vulnerability definition tables provide an analyst or other user with a perspective that can be used for entering data in each cell in respective Threat and Vulnerability grids. Threat values can normally be provided by outside sources. Vulnerability values can either be provided by analysts using templates provided by an exemplary application according to the present invention or from external sources. Such analysts can, for example, utilize independent measures, such as, for example, actuarial and handbook failure data. Such analysts can, for example, glean vulnerabilities for cyber as well as physical assets from industry sources.
In general vulnerability numerics entered need to be on an exponential scale to adequately cover the range of vulnerabilities. This is because a risk evaluation range should be expansive and look to the potential failure of an aspect of the element in question with as much granularity as available. For example, failure can occur in parts per thousand, million or more. To achieve a net-risk valuation of $50,000 on an asset worth $5 billion requires a discrimination granularity of 100,000 to 1. In exemplary embodiments of the present invention, a given software program can, for example, allow a user to immediately access the definition and some common examples of the various types of threats and vulnerabilities by, for example, clicking on the main threat or vulnerability. Alternatively, a given system could allow a user to press a button or right-click on a mouse within a particular cell in a threat vulnerability matrix and thereby bring up the definitions of the relevant threat and the relevant intersecting vulnerability as well as seek common examples of that type of threat and that type of vulnerability so as to be better able to enter threat and vulnerability data.
In contrast to such a comprehensive and orthogonal taxonomy to identify risks as described above, Table A below illustrates some conventional analogs to the six vulnerability categories used in exemplary embodiments of the present invention. With reference thereto, each "alternate term set" row depicts a conventional alternate term set for categorizing the vulnerability conceptual plane. As shown in the first, second, third, fifth and sixth rows of alternate term sets, the cells with asterisks are typical groupings of conventional terms that either do not cover the full range of vulnerabilities or are not independent of each other.
Vulnerability/Protection Semantics
(The table is provided as images in the original document and is not reproduced here.)
Table A
Figs. 10 through 17 depict exemplary threat probability matrices for each of the exemplary twelve threat types and six vulnerability types described above. In exemplary embodiments of the present invention, threat probabilities can have any value from 0 to 1 which correlates to a percentage of likely occurrence between 0% and 100%. Alternatively, the range between 0 and 1 could be expressed using three or more decimal places, in which case the range would not be construed as percentages but rather as a probability scale corresponding to the number of decimal places used. Thus, for example, threats could be expressed as having a probability of X in a thousand (using three decimal places), Y in a million (using six decimal places), etc., depending upon the number of decimal places allowed in a given embodiment for the entry of threat probability data.
As noted above, for purposes of illustration the present invention has been described using a fictitious (yet reality-inspired) N.E. Corridor example. This exemplary N.E. Corridor is composed of two SIGs, 30th Street Station and Trenton Station. For purposes of illustration, an analysis of the 30th Street Station will be shown in detail as to components and elements of said components. Thus, Figs. 10 through 17 depict element level Threat Probability Matrices, where a separate matrix is input for each of the respective eight elements in the two components of the 30th Street Station. As can be seen from Figs. 10 through 17, the 30th Street Station SIG can be divided into two components, "Ticketing" and "Structure." The component "Ticketing" can be further divided into five elements, namely "Data," "Network," "Comms," "Computers," and "Software," and the component "Structure" can be further divided into three elements, namely "Building," "People," and "Platforms." This schema will be used in the following description to illustrate the entry and processing of data in exemplary embodiments of the present invention.
IV. SIG LEVEL PROCESSING
Next described is the processing of input data at the SIG level.
With reference to Fig. 3, at 301 a user can input threat and vulnerability source data for each element of each SIG component. For example, Figs. 10 through 17 are exemplary Threat Probability Matrices and Figs. 18 through 25 are exemplary Vulnerability Matrices for each of the elements comprising the 30th Street Station SIG. The cell values in the matrices of Figs. 18 through 25, as is the case with entries in any vulnerability matrix in exemplary embodiments of the present invention, are on a scale of 1 through 15. These numbers are actually negative exponents. Thus, this scale can be chosen, in exemplary embodiments of the present invention, to represent the likelihood or probability of occurrence of a given threat against a particular vulnerability.
The values entered in the Vulnerability Matrix correspond to the odds of occurrence expressed as 1 in 10^(Cell Value). Accordingly, a cell value of 1 in such a matrix means that there is a likelihood of 1 in 10 that that particular vulnerability could occur in the presence of a relevant threat, and a value of 15 is interpreted to mean that the likelihood is 1 in 10^15 that such a vulnerability could occur. Obviously, because these numbers are actually negative exponents, the higher the number the better protection there is for that vulnerability; i.e., the lower the risk that a particular threat/vulnerability combination poses to the asset in question.
Assuming that Threat Probability Matrices and the Vulnerability Matrices have been entered for each element of each component of the SIG, at 310 this data can then, for example, begin to be processed according to the methods of the present invention. In general, the risk probability can be decomposed into two key components: threat probability and vulnerability probability. The probability of risk can be expressed as: (threat probability) * (vulnerability probability), or P_risk = P_threat * P_vulnerability. In exemplary embodiments of the present invention, these probabilities can be combined into a compounded value, for ease of calculation. Thus, at 310 a Compounded Source Cell Value can be calculated for each element, as shown in Figs. 18 through 25. This involves combining the data in Figs. 10 through 17 with that of Figs. 18 through 25, respectively, according to the following rule:
Compounded Source Cell Value ("CSCV") = Vulnerability Matrix Cell Value + LOG[1/Threat Probability Matrix Value].
Mathematically, because the vulnerability values are logs (exponents) and the threat probability values are non-exponent (i.e., values from 0 to 1), it is necessary to form a resultant exponent. Thus, a threat probability matrix value can be converted to a logarithm and added to the associated vulnerability matrix cell value. Because the compounded threat-vulnerability cell values are thus actually negative exponents, they get more negative to represent a smaller number. The unsigned value then increases as that cell's value is made less contributing. It is assumed that the threat probabilities can have different values for each type of vulnerability and threat type, i.e., for each cell in the threat probability matrix.
For example, for the cell in the first row and first column of the "Data" element of the "Ticketing" component of the 30th Street Station SIG, wherein the corresponding Threat Probability Matrix is shown in Fig. 10 and the corresponding Vulnerability Matrix is shown in Fig. 18, the Cell Value for the Compounded Threat-Vulnerability Matrix is the Cell Value of the Vulnerability Matrix, in this case 6, added to LOG[1/Threat Probability Cell Value]. Here, where the Threat Input Cell Value = 1, as can be seen from Fig. 10, the Compounded Source Cell Value = 6 + LOG[1/1] = 6 + LOG[1] = 6 + LOG[10^0] = 6 + 0 = 6. Obviously, for ease of illustration the Threat Probability Matrices, i.e., Figs. 10 through 17, were defined such that there is 100% probability of occurrence of each of the relevant threats against each of the identified vulnerabilities except in the case of cyber threats against people and physical structures wherein the example shows a nil threat. Because in such a case the formula would become indeterminate the exemplary application has a Boolean logic override for the entry of a "0" which makes the corresponding cell value N/A (not applicable).
As can be seen from the equation used at 310, for threat probabilities less than 100% the Compounded Threat Vulnerability Matrix Data will be greater in each cell than the original Vulnerability Matrix. This is understandable inasmuch as larger numbers (i.e., larger negative exponents) translate to better protection and a lesser likelihood of a particular threat succeeding against a given vulnerability.
Using the exemplary data shown in the exemplary Threat Probability Matrices of Figs. 10 through 14 (i.e., all cells = 1), corresponding Compounded Threat-Vulnerability Matrices will be identical to the Threat Vulnerability Matrices as is shown in Figs. 18 through 22. As noted above, for the case where a threat probability value is entered as a "0" for irrelevance, the application uses a Boolean override to set the value to N/A to avoid reducing the equation at 310 to an indeterminate expression.
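The compounding rule at 310, including the Boolean override for a zero threat probability, can be sketched as follows (a hypothetical helper, not the patented implementation; matrices are assumed to be nested lists of numbers, with None marking an N/A cell):

```python
import math

def compounded_source_values(vulnerability, threat_probability):
    """CSCV = Vulnerability Cell Value + LOG10(1 / Threat Probability Cell Value).
    A threat probability entered as 0 means the threat is irrelevant, so the cell
    is marked None (the Boolean override described above) instead of evaluating
    LOG10(1/0)."""
    compounded = []
    for v_row, t_row in zip(vulnerability, threat_probability):
        row = []
        for v, t in zip(v_row, t_row):
            row.append(None if t == 0 else v + math.log10(1.0 / t))
        compounded.append(row)
    return compounded

# A tiny 1x3 illustration: a 100% threat, a 10% threat, and a nil threat.
print(compounded_source_values([[6, 6, 8]], [[1.0, 0.1, 0.0]]))
# [[6.0, 7.0, None]] -- a lower threat probability raises the compounded exponent
```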
Given the Compounded Threat-Vulnerability Matrix Data from 310, the Compounded Threat-Vulnerability Matrix Data for each element can, in exemplary embodiments of the present invention, be transformed to Weighted Value Matrices at 320.
Weighted Value Matrices implicate the importance weighting factors assigned at the elemental and componental Levels as is illustrated in Fig. 2, at 230 and 240, and in Figs. 6-9. Accordingly, with reference to Figs. 18 through 25, as between the two components of 30th Street Station Ticketing and Structure, Ticketing has a component weighting of 0.8 and Structure has a component weighting of 1.0. Similarly, within the Ticketing component at the elemental level, the following elements of component Ticketing have the following weightings: Data 1.0, Network 0.9, Comms 0.9, Computers 0.8 and Software 0.6. Within component Structure, the following elements have the following weightings: Building 0.8, People 1.0, and Platforms 0.7. In exemplary embodiments of the present invention the weightings are user-assigned relative measures of importance or contribution and thus there is no requirement that they sum to unity or any other fixed number.
The weightings at each of the elemental and componental levels can be, for example, used to transform the Compounded Threat-Vulnerability Matrix Data at 310 into Weighted Value Matrix Data at 320. This is illustrated at 320 with respect to Fig. 3. To create the Weighted Value Matrix Data, each of the Compounded Threat-Vulnerability Matrices, such as are shown, for example, in Figs. 18 through 25, can be, for example, processed using the following equation:
Weighted Cell Value ("WCV") = Compounded Source Cell Value (CSCV) * (LOG [1/(Element Weight * Component Weight)], or WCV = CSCV * LOG [1/EW * CW)].
Thus, the Compounded Threat-Vulnerability Matrices of Figs. 18 through 25 can be transformed to the Weighted Value Matrices of Figs. 26 through 33, respectively.
Continuing the processing of data at the elemental level, at 330 the Weighted Value Matrices can, for example, be transformed to create Build Coefficient Matrices. Exemplary Build Coefficient Matrices can be generated by processing the Weighted Value Matrices of Figs. 26 through 33 to yield those shown, for example, in Figs. 34 through 41, respectively. The relationship between a Weighted Value Matrix for a given element of a given component of a given SIG, and the corresponding Build Coefficient Matrix ("BCM") for that element can be expressed via the equation
Build Coefficient Cell Value (BCCV) = 1 - 10^(-WCV), as shown at 330. In exemplary embodiments of the present invention, the reason for this transformation is that Build Coefficients are patterned after reliability factors. Thus, if a failure rate is one part in one thousand (.001) then a corresponding reliability factor is (1 minus the failure rate) or 0.999, for example. Thus, a Weighted Value Matrix cell value is essentially a negative exponent of 10, illustrating or expressing a vulnerability to a given threat. Therefore, a higher number corresponds to a more negative power of 10 and thus a larger denominator and an overall smaller vulnerability number. Taking the Weighted Value Matrix Cell Values and converting them to the Build Coefficient Matrix Cell Values thus involves subtracting 10 to the negative power of the Weighted Value Matrix Cell Value, or 10^(-WCV), from unity. Accordingly, if the highest cell value in the matrices of Figs. 26 through 33 were 15, the corresponding highest reliability would translate to 1 - 10^(-15) = BCCV 0.999999999999999 (15 nines). Given these mathematical constructs, the closer a Build Coefficient Matrix cell value is to unity the less vulnerability there is to a given threat type. Alternatively, although you can never have 100% reliability, the closer one can get the better off one can be.
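A short sketch of the two element-level transformations just described (hypothetical helpers; the weighting formula is taken exactly as stated above, and the example weights are the Ticketing/Data weights of 0.8 and 1.0):

```python
import math

def weighted_cell_value(cscv, element_weight, component_weight):
    """WCV = CSCV * LOG10(1 / (Element Weight * Component Weight))."""
    return cscv * math.log10(1.0 / (element_weight * component_weight))

def build_coefficient(wcv):
    """BCCV = 1 - 10^(-WCV): a reliability-style factor that approaches unity
    as the weighted vulnerability exponent grows."""
    return 1.0 - 10.0 ** (-wcv)

wcv = weighted_cell_value(6.0, element_weight=1.0, component_weight=0.8)
print(wcv, build_coefficient(wcv))  # approximately 0.581 and 0.738
```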
Next described is how the BCM data for each element in a SIG can be combined in various ways to generate a set of SIG output data in exemplary embodiments of the present invention.
At 335 in Fig. 3, the various Build Coefficient Matrices for each component can be combined to generate a Component Level Build Coefficient Matrix by cellwise multiplication. For the illustrative example, Figs. 34, 35, 36, 37, and 38, can be combined via cellwise multiplication to generate a component level Build Coefficient Matrix as shown in Fig. 42 for the component "Ticketing" of the 30th Street Station SIG. Similarly, Figs. 39, 40, and 41 can be combined via cellwise multiplication to generate a component-level Build Coefficient Matrix as shown in Fig. 43 for the component "Structure".
It is noted that the original Vulnerability Matrices for the elements of the Structure component of the 30th Street Station SIG did not have data for any cyber threats. This is because this component did not need to deal with any asset aspect that is susceptible to cyber threats, inasmuch as this component's elements are solely physical or tangible objects, namely, "Building", "People", and "Platforms." Accordingly, that lack of data is carried through in Figs. 31-33, the corresponding Weighted Value Matrices, which also show no values for any cyber threat. It is noted that the cell values for all cyber threats in Figs. 31-33, Figs. 39-41 and Fig. 43 are shown as "#VALUE!" which indicates that no value has been entered for such cells. Therefore, when it is necessary to generate a Component-Level Build Coefficient Matrix for Structure, no value will be given to any cell associated with a cyber threat. To deal with such cells mathematically, in exemplary embodiments of the present invention, when creating higher level matrices via cellwise multiplication, any non-existent cell value such as is seen in Fig. 43 represented as "#VALUE!" can be simply assigned a value of 1 by a Boolean logic element so that it has no effect on such cellwise multiplication product when multiplied with an actual cell value. Accordingly, Fig. 43 illustrates an exemplary Compounded Structure Component Build Coefficient Matrix, generated by cellwise multiplication across the matrices shown in Figs. 39 through 41, which simply perpetuates the no-value in each cell associated with a cyber threat, and Fig. 44 illustrates an exemplary SIG Level Build Coefficient Matrix which perpetuates the values for cyber threats of Fig. 42, the Compounded Ticketing Component Build Coefficient Matrix, unchanged. Given the Compounded Component Build Coefficient Matrices shown in Figs. 42 and 43, at 340 a SIG Level Build Coefficient Matrix can be generated by cellwise multiplication across the various Compounded Component Build Coefficient Matrices. In the illustrative example this entails cellwise multiplication of Figs. 42 and 43. However, as noted above, any cell which has no value is assigned a value of 1. Thus, because there are only two components in the 30th Street Station SIG, the only values in the SIG Level Build Coefficient Matrix for all cyber threats are those values from Fig. 42, which are those of the Compounded Ticketing Component Build Coefficient Matrix. Fig. 44 is thus the product of cellwise multiplying the matrices of Figs. 42 and 43, and is thus a SIG Level Build Coefficient Matrix incorporating the data from both the Ticketing and Structure components of the 30th Street Station SIG.
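The cellwise multiplication used at 335 and 340, including the treatment of empty cells, can be sketched as follows (a hypothetical helper; the tiny matrices are made-up values, with None standing in for the "#VALUE!" cells discussed above):

```python
import math

def cellwise_product(matrices):
    """Multiply equally sized matrices together cell by cell. A cell with no
    value (None) is treated as 1 so it does not affect the product; a cell that
    is empty in every input matrix remains empty in the result."""
    rows, cols = len(matrices[0]), len(matrices[0][0])
    product = []
    for i in range(rows):
        row = []
        for j in range(cols):
            values = [m[i][j] for m in matrices if m[i][j] is not None]
            row.append(math.prod(values) if values else None)
        product.append(row)
    return product

# Element-level Build Coefficient Matrices roll up into a component-level matrix,
# and component-level matrices roll up into the SIG Level Build Coefficient Matrix.
ticketing = cellwise_product([[[0.999, 0.99]], [[0.9999, 0.999]]])
structure = cellwise_product([[[None, 0.99]], [[None, 0.999]]])   # no cyber data
print(cellwise_product([ticketing, structure]))
```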
Continuing with reference to Fig. 3, at 350 a SIG Level Build Coefficient Matrix, for example, can be transformed to a SIG Matrix, i.e., a Gradient matrix for the SIG. As shown at 350 to generate the SIG Matrix (Fig. 45), each cell in the SIG Level Build Coefficient Matrix (Fig. 44) is transformed using the formula:
SIG Matrix Cell Value = -LOG(1 - SLBC Cell Value).
This mathematical transformation takes a build coefficient reliability factor type cell value and transforms it back to a number which represents a negative exponent of 10 to express the opposite of vulnerability or "index of protection", where the lesser vulnerability the higher the number. The SIG Matrix value is of the same type as was the original Vulnerability Matrices and the Compounded Threat- Vulnerability Matrices. These numbers, therefore, run on a nominal scale of 1 through 16 (or, where the threat probabilities are all 100%, on a nominal scale of 1 to 15). An exemplary SIG Matrix generated from the SIG Level Build Coefficients Matrix of Fig. 44 is depicted in Fig. 45. Fig. 45 is thus a mathematically created matrix representing the overall vulnerability of the 30th Street Station SIG.
Continuing with reference to Fig. 3, at 360, a mathematical quantity known as a "Figure of Merit" or "FOM" can be calculated from the SIG Level Build Coefficients Matrix. An FOM measures the total reliability or "invulnerability" of a SIG. A FOM can be generated, for example, by multiplying all of the cells in the SLBC Matrix together, and then converting this product to an exponent (of 10) to indicate what an equivalent hybridized overall SIG Value is. An FOM is somewhat analogous to a GPA score for academic performance but without the effect of high scores averaging out failures or very low scores. The non-linear FOM representation does not allow very low scores or "failures" to be hidden from view or computation.
Thus, continuing with reference to Fig. 3, using a SLBC Matrix, an FOM can be calculated at 360 according to the formula
FOM = -LOG[1 - Π(SLBC_ij values, from i,j = 0,0 to i,j = M,N)].
For example, from the exemplary SLBC Matrix of Fig. 44 an exemplary FOM of 0.128 can be generated. The FOM and the SIG Matrix Values can be output to the next level up in the asset analysis hierarchy. Next, at 370 Net Risk for the SIG can be computed using the following formula:
Net Risk = SIG Level Asset Value * 10^(-SIG FOM).
These calculations are illustrated in Fig. 46. Fig. 46 is a color-coded three-dimensional depiction of the SIG Matrix Values of Fig. 45. Fig. 46 also presents the FOM for this SIG, computed as described above. Fig. 47 once again presents the SIG Matrix Values of Fig. 45 and additionally illustrates the Net Risk Figure for the 30th Street Station SIG.
Thus, for a SIG FOM of 0.128 for the 30th Street Station, calculated as provided at 360 of Fig. 3, a quantity 10^(-SIG FOM), called the "Risk Multiplier", can be generated using the equation:
Risk Multiplier = 10^(-SIG FOM)
and, multiplying the Risk Multiplier by the SIG's valuation,
Net Risk = SIG Valuation * Risk Multiplier,
a SIG Net Risk can be calculated. Accordingly, using a SIG FOM of 0.128280, a Risk Multiplier can be generated as 10^(-0.128280), which yields 0.744252, or 74.425%. A Risk Multiplier is a function of an FOM representing the fraction of the valuation of the SIG (or other asset) which is at risk given that FOM. Thus, for a 30th Street Station valuation of $2.5 billion and an FOM of 0.128280, 74.425% of the total Asset Value is at risk given the current threat vulnerability data encoded in the FOM, yielding $1,860,631,568 as the net risk associated with the 30th Street Station. It is noted that this is a very large at-risk value. This is due to a very low FOM (any FOM less than 1 is considered exceedingly low - putting large fractions of asset value at risk). Thus, the 30th Street Station SIG is a prime candidate for an optional risk remediation process, as described below with reference to Fig. 4.
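The chain from a SIG Level Build Coefficient Matrix to an FOM, a Risk Multiplier and a Net Risk figure can be sketched as follows (hypothetical helpers; the FOM of 0.128280 and the $2.5 billion valuation are taken from the worked example above, while the small matrix used to exercise the functions is made up):

```python
import math

def sig_matrix(slbc):
    """SIG Matrix Cell Value = -LOG10(1 - SLBC Cell Value); empty cells stay empty."""
    return [[None if c is None else -math.log10(1.0 - c) for c in row] for row in slbc]

def figure_of_merit(slbc):
    """FOM = -LOG10(1 - product of all SLBC cell values); empty cells are skipped."""
    cells = [c for row in slbc for c in row if c is not None]
    return -math.log10(1.0 - math.prod(cells))

def net_risk(asset_value, fom):
    """Risk Multiplier = 10^(-FOM); Net Risk = Asset Value * Risk Multiplier."""
    return asset_value * 10.0 ** (-fom)

print(sig_matrix([[0.999, 0.99]]))                 # roughly [[3.0, 2.0]]: exponents recovered
print(round(figure_of_merit([[0.999, 0.99]]), 3))  # 1.959 for this made-up matrix
print(net_risk(2_500_000_000, 0.128280))           # ~1.86 billion, about 74.425% of $2.5B
```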
Finally, at 380 the FOM for the SIG, the SIG Matrix and the SIG Net Risk can be output "upwards" to be used in calculations at the enterprise level. In the illustrative example the SIG Matrix Values for the 30th Street Station SIG can be output to the next level to be combined with analogous SIG Matrix Values for the Trenton Station SIG to thereby arrive at a combined Enterprise Level Gradient Matrix for the N.E. Corridor, an enterprise level asset. From an Enterprise Level Gradient Matrix an enterprise level FOM can be calculated. It is noted that the N.E. Corridor, an enterprise level asset, is at the Lth Asset Level with reference to 210 in Fig. 2, where L=4. It is also noted that in the top-down asset analysis process, sub-assets at the Lth asset level are divided into SIGs; thus, as illustrated, the N.E. Corridor, a sub-asset at the Lth (4th) asset level, was divided into the two SIGs of the 30th Street Station and the Trenton Station.
V. ENTERPRISE LEVEL PROCESSING
A. Gradient Calculations
Next described is the generation of a Gradient matrix at the enterprise level with reference to Fig. 4. Having obtained a SIG Matrix for each SIG (such as, for example, those depicted in Figs. 47 and 48) and a Net Risk Value corresponding to an FOM, these values can be input to a higher level of analysis to compute enterprise level Gradient matrices and total net risk for the enterprises which such SIGs comprise.
In order to generate a Gradient matrix for an enterprise, such as, for example the N.E. Corridor, Gradient data from each contributory SIG needs to be combined into an overall measure for the enterprise as a whole. This can be done, for example, by forming "Real Value Factors" ("RVFs") from a SIG matrix. Because the values in the SIG Matrices are exponents, in order to be manipulated and combined they need to be converted to regular numbers (or "real values"). In exemplary embodiments of the present invention, such real values can, for example, be termed "Real Value Factors." Accordingly, at 410 for each SIG Matrix, RVFs can be calculated using the formula:
RVF_ij = 10^(-SIG Value_ij), over i,j = 0,0 to i,j = M,N.
Once SIG Matrix values have been converted to real numbers in the form of RVFs, they can be linearly combined to create an Enterprise Level Gradient Matrix. Such combination, in exemplary embodiments of the present invention, can be linear and can, for example, be weighted by the relative value of each SIG (whether in dollar terms, quality of life terms, lives at risk, or any other convenient metric as may be useful) within the enterprise. This is illustrated for the exemplary N.E. Corridor example in Figs. 47 and 48. Fig. 47 shows that the at-risk valuation of the 30th Street Station SIG is $2.5 billion. Similarly, Fig. 48 shows that the at-risk valuation of the Trenton Station SIG is $500 million. Therefore, the total at-risk valuation of the N.E. Corridor enterprise is $2.5 + $0.5 billion = $3 billion. These numbers are provided in Table B below.
Table B
(Total Asset value is $3.0 Billion)

Asset                  Valuation         Proportioned Matrix Risk Multiplier used in compounding "up"
Trenton                $500,000,000      0.166666667
30th Street Station    $2,500,000,000    0.833333333
Total N.E. Corridor    $3,000,000,000    1.000000000
The quotient defined by the at-risk valuation of a SIG divided by the total at-risk valuation of all SIGs in a given enterprise can, in exemplary embodiments of the present invention, be termed the "Risk Multiplier." It is the Risk Multiplier that can be used to transform RVFs derived from SIG Matrices into proportional RVFs ("PRVFs"), which are weighted RVFs. Thus, at 415 the Risk Multiplier for each SIG within an enterprise can be calculated using the formula:
RM = SIG at-Risk Value / Σ at-Risk Value_k
over all SIGs k = 1 to K comprising the relevant enterprise. The Risk Multiplier can then be multiplied by each element in an RVF Matrix to generate a proportional RVF ("PRVF") Matrix for each SIG, as shown at 420, using the formula:
PRVF_ij = RVF_ij * RM. Thus, Fig. 50 depicts an RVF matrix for the 30th Street Station SIG, and Fig. 51 depicts its corresponding PRVF matrix. Fig. 51 was generated by multiplying each element of Fig. 50 by an RM of 83.33% (see Fig. 6).
Similarly, Fig. 52 depicts an exemplary RVF matrix for the Trenton Station SIG, and Fig. 53 its corresponding PRVF matrix, generated by multiplying the RVF matrix of Fig. 52 by an RM of 16.67%.
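Steps 410 through 420 can likewise be sketched compactly, here under the simplifying assumption of tiny 2 x 2 placeholder matrices in place of the full 12 x 6 SIG Matrices (function names are invented; the valuations are those of Table B):

```python
def rvf(sig_matrix):
    # RVF_ij = 10^(-SIG Value_ij): convert exponents back to real values
    return [[10.0 ** (-v) for v in row] for row in sig_matrix]

def risk_multiplier(sig_value, all_values):
    # RM = SIG at-risk value / sum of at-risk values of all SIGs in the enterprise
    return sig_value / sum(all_values)

def prvf(rvf_matrix, rm):
    # PRVF_ij = RVF_ij * RM (valuation-weighted real value factors)
    return [[v * rm for v in row] for row in rvf_matrix]

valuations = {"30th Street Station": 2_500_000_000, "Trenton Station": 500_000_000}
rm_30th = risk_multiplier(valuations["30th Street Station"], valuations.values())
rm_trenton = risk_multiplier(valuations["Trenton Station"], valuations.values())
print(rm_30th, rm_trenton)  # 0.8333... and 0.1666..., as in Table B

sig_30th = [[2.0, 3.5], [1.2, 4.0]]  # invented placeholder SIG Matrix values
print(prvf(rvf(sig_30th), rm_30th))
```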
Continuing with reference to Fig. 4, at 430 the various proportional RVF Matrices can be combined by cellwise addition to generate a consolidated RVF Matrix, or CRVF, for the enterprise. This can be done, for example, using the following formula:
CRVF_ij = Σ (PRVF_k)_ij
for all cell values i,j, over all SIGs k = 1 to K in the enterprise. At 440 the real numbers of a CRVF matrix can be once again transformed to exponents, thereby generating an Enterprise Level Gradient Matrix ("ELGM") using the formula:
ELGM_ij = -LOG(CRVF_ij).
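Continuing the same illustrative sketch, steps 430 and 440 reduce to a cellwise sum of the PRVF matrices followed by a negative base-10 logarithm (the matrices below are invented placeholders):

```python
import math

def consolidate(prvf_matrices):
    # CRVF_ij = sum over all SIGs k of (PRVF_k)_ij (cellwise addition)
    rows, cols = len(prvf_matrices[0]), len(prvf_matrices[0][0])
    return [[sum(m[i][j] for m in prvf_matrices) for j in range(cols)]
            for i in range(rows)]

def gradient(crvf):
    # ELGM_ij = -LOG(CRVF_ij): back to exponent ("index of protection") form
    return [[-math.log10(v) for v in row] for row in crvf]

# Two tiny invented PRVF matrices standing in for Figs. 51 and 53:
prvf_a = [[0.008, 0.0003], [0.05, 0.00008]]
prvf_b = [[0.002, 0.0001], [0.01, 0.00002]]
elgm = gradient(consolidate([prvf_a, prvf_b]))
print(elgm)  # each cell is again a nominal exponent-style value
```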
A CRVF Matrix for the exemplary N. E. Corridor enterprise is depicted in Fig. 54, and an associated ELGM is depicted in Fig. 58. Fig. 59 is a multi-colored 3D graphic display of the ELGM of Fig. 58. Thus, the 3D surface of Fig. 58 can be seen as a complex weighted combination of the 3D surfaces of Figs. 46 and 49. Due to the more pronounced contribution of the 30th Street Station SIG relative to that of the Trenton Station SIG, the surface depicted in Fig. 59 appears to be more similar to the 30th display of the 30th Street Station SIG Matrix depicted in Fig. 46. This would also be evident in a corresponding 2-D figure similar to that in Fig. 65. As noted above in connection with SIG matrices, Gradient matrices at the enterprise or any higher level can also be displayed as a 2D contour map or by using any other technique to display functions of two variables as is known.
At 445, from an ELGM an Enterprise Level Build Coefficient Matrix can be generated, using the formula:
ELBCM_ij = [1 - 10^(-ELGM_ij)].
Fig. 57 depicts such an ELBCM for the exemplary N.E. Corridor enterprise. As is described above in connection with SIG level processing, a build coefficient matrix can be used to generate an FOM. Here, at 450, an enterprise level compounded FOM ("ELCFOM") can be calculated from an ELBCM using the formula
ELCFOM = -LOG[1 - {Π(ELBCM_ij from i,j = 0,0 to i,j = M,N)}]
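Steps 445 and 450 mirror the SIG level build coefficient and FOM calculations, now applied to the enterprise gradient matrix; a continuation of the sketch, with invented gradient values:

```python
import math

def build_coefficients(elgm):
    # ELBCM_ij = 1 - 10^(-ELGM_ij)
    return [[1.0 - 10.0 ** (-v) for v in row] for row in elgm]

def compounded_fom(elbcm):
    # ELCFOM = -LOG[1 - product of all ELBCM cells]
    prod = 1.0
    for row in elbcm:
        for cell in row:
            prod *= cell
    return -math.log10(1.0 - prod)

elgm = [[2.0, 3.4], [1.2, 4.0]]  # invented enterprise gradient values
print(round(compounded_fom(build_coefficients(elgm)), 3))  # a low FOM near 1.1
```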
Given an ELGM and a corresponding FOM generated from a related ELBCM, at 460 they can both be displayed. Fig. 59 depicts an FOM for the exemplary N.E. Corridor enterprise and a 3D multicolored graphic display of an associated ELGM. It is noted that the FOM value of 0.173 is dangerously low, which indicates that iterative risk remediation may be a useful option, as described below.
Finally, at 465 the Net Risk for the enterprise can be calculated by simply summing the individual Net Risk values for each SIG in the enterprise which were output to the enterprise processing at 380 of Fig. 3. Thus, with reference to Fig. 6, Net Risk_N.E. Corridor = Net Risk_30th Street Station + Net Risk_Trenton Station, or Net Risk_N.E. Corridor = $1,860,667,446 + $5,674,000 = $1,866,341,446.
As can be seen from the process flow of Figs. 3 and 4, and as will be described below in connection with Figs. 60-63, at all levels above the enterprise level the output set from the immediately lower level is all that is needed to generate data for that asset level. Thus, while detailed input data is needed to generate SIG Matrices and SIG FOMs, at the enterprise level all that is needed to generate Enterprise Level Gradient Matrices and Enterprise FOM and Net Risk are SIG Matrices and SIG Net Risk Values, precisely the outputs of SIG processing as shown in Fig. 3. Thus each level need not see the original input data or any outputs of any level lower than its immediately lower level. I.e., for all asset levels N above a SIG, the only input data needed is the Level (N+1) Gradient Matrix and the Level (N+1) Net Risk Value. From this, in completely analogous fashion to the generation of enterprise output data from SIG output data, a Level N Gradient Matrix and a Level N Net Risk Value can be calculated. This functionality is significant inasmuch as it allows for risk analysis of large assets (such as the Transportation, Pharmaceutical or Automotive sectors illustrated in Figs. 6-8, respectively) where the various levels of assets may be held by entities not desiring to share risk data with higher levels.
Thus, combining said data to generate an overall risk score and output threat-vulnerability matrix for all asset levels above a lowest sub-asset level can be performed solely using the output data of the immediately lower level, without any need for the original input data. Accordingly, in exemplary embodiments of the present invention output data can be insulated from input source data so that objective numerical results can be compounded without revealing the risks of individual contributors in non-federated analyses. This can be desirable where, for example, a given asset or sector has one or more lower level sub-assets that are sovereign entities.
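This level-to-level recursion can be pictured as a single roll-up routine that consumes only the immediately lower level's gradient matrices, weights and net risk values and emits the outputs for the level above (an illustrative sketch; the data structures and names are not taken from the patent's figures):

```python
import math

def roll_up(children):
    # children: list of (gradient_matrix, risk_multiplier_weight, net_risk)
    # tuples for the immediately lower level.  Returns the gradient matrix,
    # FOM and net risk for the level above.
    rows, cols = len(children[0][0]), len(children[0][0][0])
    crvf = [[sum(w * 10.0 ** (-g[i][j]) for g, w, _ in children)
             for j in range(cols)] for i in range(rows)]
    gradient = [[-math.log10(v) for v in row] for row in crvf]
    prod = 1.0
    for row in gradient:
        for v in row:
            prod *= (1.0 - 10.0 ** (-v))
    fom = -math.log10(1.0 - prod)
    net_risk = sum(nr for _, _, nr in children)
    return gradient, fom, net_risk

# Invented 1 x 2 gradient matrices; the weights and net risk values are illustrative.
level_n_gradient, level_n_fom, level_n_net_risk = roll_up(
    [([[2.0, 3.0]], 0.8333, 1_860_667_446),
     ([[4.0, 5.0]], 0.1667, 5_674_000)])
```

Note that nothing below the immediately lower level appears in the inputs, which is what permits the insulation of source data described above.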
B. Iterative Risk Remediation
In exemplary embodiments of the present invention, given an enterprise level compounded FOM value and an associated Enterprise Level Gradient Matrix, a user can optionally proceed to an iterative FOM improvement process, at 470, if the FOM is low or if it is obvious that there are certain values in the ELGM that are unacceptably low, relative to a defined value or values.
Thus, for example, in the exemplary N.E. Corridor enterprise the ELGM value of 0.79 in the cell at the intersection of threat "Physical Terrorist" and vulnerability "Discernable" is unreasonably low. Similarly, the value of 0.53 at the intersection of threat "Physical Environmental" and vulnerability "Accessible" is also unreasonably low. In general, any vulnerability score less than 2 (which correlates to a probability of occurrence greater than or equal to 1 in 100) is unacceptably low in most contexts. A cell value less than 1 correlates to a probability of occurrence greater than 1 in 10, which is almost always unacceptable. If these values can be raised, then perhaps the overall FOM can be raised to a more acceptable number, resulting in a lesser portion of the value of the enterprise being at risk. Such an iterative FOM improvement process is illustrated in Fig. 5, and an example of such a process is illustrated in Fig. 64.
As a general guide, the target for remediation is any cell whose value is below one more than an FOM value that would produce acceptable risks. For example, an FOM value of 4 might be an acceptable risk value, so any cell with a value less than 5 could be investigated. The reason for such an increase in the target value for cells is that, because cell values are multiplied together, the resulting product will always be lower than any individual cell value.
Because an Enterprise Level Gradient Matrix is composed of derived elements, the only way to raise an overall enterprise level FOM is to raise the corresponding vulnerability values at the elemental level. This can be done, for example, by going back to the element level vulnerability matrices contributing to the low vulnerability value and comparing the costs of remediating those low-valued risks (so as to increase their values) against the associated reduction in the proportion of the asset that is at risk as a result of the remediation.
In general, as long as the remediation costs are less than the change in Net Risk Value by an acceptable ROI factor, it is worthwhile to pay that remediation cost, recalculate the Net Risk Value and the FOM, and inquire as to whether further remediation, given its cost(s), would continue to decrease the Net Risk Value of that asset by a still greater amount than the associated remediation cost. This process will next be described in detail with reference to Figs. 5 and 64.
With reference to Fig. 5, at 501 the vulnerabilities in the element-level matrices that have values below the initial FOM can be selected, net risk values collected, and the remediation costs to bring those cells to a value of at least one higher than the initial FOM calculated. At 502, Gradient calculations at the relevant SIG levels can be rerun and the reduction in retained risk values observed. As noted above, there are various cost metrics that can be associated with assets in exemplary embodiments of the present invention. Therefore, although dollars have been used for the illustrated exemplary N.E. Corridor enterprise, in alternative exemplary embodiments lives at risk, quality of life impact, or a variety of other cost or valuation metrics associated with an asset can be used.
For example, in the enemy targets example depicted in Fig. 9, as opposed to merely placing more dollar value of assets at risk, it may be desirable, in exemplary embodiments of the present invention, to maximize the "strategic value" of various assets which are at risk. Such a "strategic value" may not directly correlate with dollar values but may represent the value of such an asset to an enemy in operating militarily. Numerous other exemplary asset valuation metrics can be used as may be appropriate or desirable in various contexts.
Continuing with reference to Fig. 5, at 503, the return on investment or ROI can be calculated using the following formula:
ROI= Reduction in Lth Level Net Risk Value / ∑ SIG remediation costs up to the Lth Level
Finally at 504, processes 501, 502, and 503 can be repeated until the ROI fails to meet investment/remediation-yield criteria. In general, such investment/remediation-yield criteria can be set by a user or by a special risk analyst. Such an analyst could be, for example, either the same analyst that performed an asset analysis process and generated the data input with the threat and vulnerability matrices, or, for example, a different type of asset analyst sometimes known in the art as a "risk governor".
Fig. 5 depicts process flow for an exemplary iterative risk remediation process at the enterprise level according to an exemplary embodiment of the present invention. To illustrate such a process with actual numbers would be rather complicated, inasmuch as each element of each component of each SIG would need to be analyzed for ways to improve (i.e., increase the values of) its respective vulnerability matrix entries and the costs of such remediation calculated. Each such iteration could result in a change to the FOMs for each SIG and a resultant recalculation of the FOM for the enterprise as described in connection with Fig. 4.
Alternatively, sometimes the source of a low enterprise level FOM is a SIG with a large Asset Weight Factor that has a significantly lower FOM than the other SIGs in the enterprise. For example, in the N.E. Corridor example, the 30th Street Station SIG fits just such a profile. It has an Asset Weighting Factor of 72.289%, and it thus contributes significantly to the overall N.E. Corridor FOM, and its FOM is an exceedingly low 0.128. Thus, risk remediation at the 30th Street Station level may be sufficient to solve the problem. Such a process is illustrated in Fig. 64. With reference to Fig. 64, there are seven columns labeled 6401, 6403, 6405, 6407, 6409, 6411 and 6413. Each column contains a different type of metric associated with a given set of threat and vulnerability matrix values. There are also six rows labeled "Baseline analysis" and "Next analysis-1" through "Next analysis-5." Each such row represents a different data set where a different FOM (based on changing one or more of the values of said threat and vulnerability matrices) gives rise to different Net Risk values. There is a certain defined remediation cost associated with moving from each row to the one below it. The first row, the Baseline analysis row, is the starting point for the iterative risk remediation process. Thus, the FOM listed in the Baseline analysis row is the same FOM provided in Figs. 46 and 47 for the 30th Street Station, namely 0.128. As noted above, this figure is very low, and therefore the 30th Street Station SIG is a prime candidate for an iterative risk remediation process.
Continuing with reference to Fig. 64, column 6405 provides a Net Risk Multiplier associated with each FOM. As the FOMs in column 6403 increase, the Net Risk Multiplier, which represents the fraction of the asset's value which is at risk, will decrease. Associated with each Net Risk Multiplier, in column 6407, is a Net Risk Value, or actual dollar value at risk. The Net Risk Value is the asset value of the asset, here the 30th Street Station, multiplied by the Net Risk Multiplier. Column 6409 provides the remediation cost of moving from each row to the next row below.
Obviously, wherever the remediation cost is less than the corresponding net risk savings, or diminution in the Net Risk Value, it is worthwhile to make the move, pay the remediation costs, and bring the FOM to the next point. Column 6411 thus provides a Return On Investment ("ROI") which is an expression of the relationship between the remediation cost of moving to a new FOM and the savings in retained risk, or Net Risk Value, associated with that new FOM. Finally, column 6413 provides a different measure of net risk. It is not a dollar figure or other monetary or financial valuation, but rather the lives-at-risk associated with a given FOM. For the purposes of column 6413, in an analogous fashion to the asset value for the 30th Street Station in column 6407 of $2.5 billion, there is a maximum number of lives that are implicated by the 30th Street Station, i.e., 1258 lives. Associated with each FOM is a certain fraction of those lives that are at risk. Increasing the FOM thus decreases the Net Risk Multiplier and thereby decreases as well the lives that are at risk.
Continuing with reference to Fig. 64, at the Baseline analysis row, with an initial FOM of 0.128 the associated Net Risk Multiplier is 74.427%. Thus the initial retained risk for the 30th Street Station is equal to 0.74427 x $2.5 billion, or $1,860,667,446. To move from a starting FOM of 0.128 to the next FOM of 1.92, there is a remediation cost of $16,200,000. This FOM will decrease the Net Risk Value of the 30th Street Station to $30,056,611. Therefore the ROI is equal to the original Net Risk Value of $1,860,667,446 less the ending Net Risk Value of $30,056,611, divided by the remediation cost of $16,200,000, or [(1,860,667,446 - 30,056,611)/16,200,000] = 113. An ROI greater than 1 is generally beneficial, and an ROI greater than 100 represents a significant return on the invested risk remediation costs.
Proceeding in this manner for each subsequent row, the values in the remainder of the rows in Fig. 64 can be similarly generated. In general, an iterative risk remediation process can continue until a DRP, or Diminishing Return Point, is arrived at. As can be seen in Fig. 64, the Diminishing Return Point is achieved when the investment cost equals, or substantially equals, the prospective reduction in retained risk, or when the ROI is substantially equal to unity. In other words, when the remediation cost is greater than or equal to the associated decrease in Net Risk Value, it simply does not make economic sense to further remediate the risk. This is seen in the row entitled Next analysis-5, where the DRP has been reached. That the DRP has been reached can be seen by noting that the starting FOM is 4.83 and the anticipated ending FOM is 4.90, which translates to a Net Risk Value diminution of ($36,978 - $31,473) = $5,505 (as shown in column 6407), which is essentially equal to the cost of the risk remediation required to move from the starting FOM of 4.83 to the ending FOM of 4.90, namely $5,500, as shown at the bottom of column 6409, resulting in an ROI of 1 in column 6411. Thus, at this point the iterative process stops and the FOM has been increased as much as it can be given a risk remediation expense that is economically advantageous. Thus, as a result of the iterative risk remediation process the Net Risk has been reduced by $1,860,667,446 - $36,978 = $1,860,630,468, at a cost of only $17,189,500, for an overall ROI of 108.24 for the remediation process. Moreover, as a result of the risk remediation process the percentage of the 30th Street Station value which is at risk has been reduced from 74.427% to only 0.0015%.
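The stopping rule described above (continue remediating while the ROI exceeds unity, and stop at the Diminishing Return Point) can be sketched as a simple loop; the first step below mirrors the Fig. 64 move from the baseline FOM of 0.128 to 1.92, while the remaining steps and costs are invented for illustration:

```python
def remediate(asset_value, baseline_fom, steps, roi_floor=1.0):
    # Each step is (candidate new FOM, remediation cost to reach it).
    # Net Risk = asset value * 10^(-FOM); ROI = reduction in Net Risk / cost.
    fom = baseline_fom
    net_risk = asset_value * 10.0 ** (-fom)
    spent = 0.0
    for new_fom, cost in steps:
        new_net_risk = asset_value * 10.0 ** (-new_fom)
        roi = (net_risk - new_net_risk) / cost
        if roi <= roi_floor:  # Diminishing Return Point reached
            break
        fom, net_risk, spent = new_fom, new_net_risk, spent + cost
    return fom, net_risk, spent

final_fom, final_net_risk, total_spent = remediate(
    2_500_000_000, 0.128,
    [(1.92, 16_200_000),   # ROI ~113, as in the text
     (4.83, 980_000),      # invented intermediate step
     (4.90, 5_600)])       # saving (~$5,505) no longer exceeds cost: stop
```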
Alternatively, where an asset valuation metric is not financial, but rather associated with human lives or quality of life, a risk remediation analysis may want to continue irrespective of the costs of remediation, even if such costs are considered unreasonable given the restrictions of budget and resources. Thus, using the asset valuation metric of column 6413, a risk remediation process could, in exemplary embodiments, proceed until there are no lives at risk, regardless of the costs of such remediation.
VI. COMPLETE PROCESSING EXAMPLE
To illustrate the complete processing of data from data entry at the SIG component level up through calculation of output data for a Level 3 asset, an example of such processing will be described for an exemplary Amtrak system, a sub-asset of the Rail sub-sector of Transportation, as shown in Fig. 6. Such processing is illustrated with reference to Figs. 60-63, next described.
Figs. 61 through 63 trace the processing of data from input at the component level through calculation of the FOM, Net Risk Value, and Level 3 Asset Gradient Matrix (one level above the enterprise level), which can be graphically displayed as a 3-D multicolored surface (or as any other representation of a function of two variables, as noted above, such as, for example, a 2D contour map, a 2D "hot spot" map, etc.) for the Level 3 Asset Amtrak. Fig. 60 is an end-to-end depiction of this process, which originates on the far right side of Fig. 60 and terminates on the far left side of Fig. 60. Figs. 61 through 63 each represent approximately one-third of the process flow depicted in Fig. 60, for ease of description and illustration. Fig. 61 depicts the portion of the processing flow from the inputting of data at the SIG component level to the generation of outputs at the SIG level. This is the process flow which corresponds to Fig. 3. Fig. 62 depicts the process flow starting from using as inputs the enterprise level data for each of the N.E. Corridor enterprise and the CA Coastal enterprise to the generation of proportional RVF ("PRVF") matrices for each of these enterprises and carrying forward the Net Risk Value for each of these enterprises. Finally, the process flow depicted in Fig. 63 begins with the PRVF matrices generated as shown in Fig. 62 and uses this data to generate an Amtrak Level 3 Asset Gradient Matrix as well as an Amtrak FOM. Additionally, the carried-through Net Risk Value data for the two enterprises are summed to generate an Amtrak Net Risk Value. In what follows the overall process flow will be described with reference to Figs. 61 through 63, although the reader may wish to refer to the complete process flow as depicted in Fig. 60 to view the entire process synoptically.
As noted above, and as shown in Fig. 6, the Level 3 asset "Amtrak" can be divided into three enterprises, namely, the N.E. Corridor, Others, and CA Coastal. For purposes of the example depicted in Figs. 60 through 63, the Level 3 asset Amtrak will be considered to be composed of only the N.E. Corridor and CA Coastal enterprises. In turn, the N.E. Corridor enterprise is composed of two SIGs: 30th Street Station and Trenton Station. With reference to Fig. 61, beginning at the right side of the figure, data from the two SIGs comprising the N.E. Corridor can be entered and processed. Thus, at 6101 the 30th Street Station SIG Matrix (shown in Fig. 47) can be input and at 6107 the Trenton Station SIG Matrix (shown in Fig. 48) can be input. Each of these SIG Matrices can be processed in parallel. Thus, at 6102 the 30th Street Station RVF matrix can be generated, as shown in Fig. 50. Similarly, at 6006, the Trenton Station RVF matrix can be generated, as shown in Fig. 52. Finally, these two RVF Matrices can be transformed to Proportional RVF Matrices at 6103 and 6105, respectively, as shown in Figs. 51 and 53, respectively. The Proportional RVF matrices 6103 and 6105 can be added cellwise at 6113 to generate an N.E. Corridor Consolidated RVF Matrix, shown in Fig. 54. It is the N.E. Corridor Consolidated RVF matrix from which the N.E. Corridor enterprise level Gradient matrix is generated at 6201. Prior to describing subsequent process flow at the enterprise level, the creation of FOMs will next be described. As noted, the 30th Street Station SIG Matrix is input at 6101. This matrix is shown in Fig. 47. This matrix can be transformed via the formula 1 - 10^(-cell value) to generate a 30th Street Station Build Coefficients matrix at 6110, as shown in Fig. 55. From that point, at 6111, a 30th Street Station FOM Intermediary Product can be generated, which is the product of all cells in the 30th Street Station Build Coefficients matrix. From the FOM Intermediary Product 6111 a 30th Street Station FOM can be generated at 6121 by the formula
FOM = -LOG(1 - FOM Intermediary Product).
From the FOM, a 30th Street Station Net Risk Multiplier 6122 can be generated via the operation 10^(-FOM). Taking the product of the Net Risk Multiplier 6122 and the 30th Street Station Valuation Input 6112 generates a Net Risk Value 6123. In the case of the 30th Street Station, this is $1,860,667,446 (see Fig. 6).
Similarly, tracking the generation of FOM and Net Risk Value for the Trenton Station, beginning at 6106 the Trenton Station SIG matrix can be input, shown at Fig. 48. From this matrix, using the formula
1 - 10^(-cell value), a Trenton Station Build Coefficients matrix can be generated at 6114, shown at Fig. 56. From there, by taking the product of all cells in that matrix, a Trenton Station FOM Intermediary Product can be generated at 6115, which can be transformed to an FOM at 6125 using the formula
FOM = -LOG(1 - Trenton Station FOM Intermediary Product).
The Trenton Station FOM can then be transformed to a Net Risk Multiplier at 6126 using the formula 10^(-FOM), and the Net Risk Multiplier 6126 multiplied by the Trenton Station Valuation Input 6116 can thus yield an exemplary Trenton Station Net Risk Value 6127 of $5,674,000 (see Fig. 6).
Fig. 62 deals primarily with processing at the N.E. Corridor enterprise level. Beginning at 6201, the N.E. Corridor Enterprise Level Gradient Matrix, shown in Fig. 58, can be carried over from Fig. 61. At 6202, this ELGM can be transformed to a N.E. Corridor Consolidated Build Matrix, as shown in Fig. 57. At 6203 an N.E. Corridor FOM Intermediary Product can be generated, by multiplying all of the cells in the N.E. Corridor Consolidated Build Matrix (Fig. 57) together, and at 6211 a N.E. Corridor FOM can be formed from the FOM Intermediary Product 6203 using the equation
FOM = -LOG (1 - N.E. Corridor FOM Intermediary Product).
In parallel, the N.E. Corridor ELGM can be graphically displayed, as shown in Fig. 59, at 6210. Adding together the Net Risk Values from the 30th Street Station (6123, Fig. 61) and the Trenton Station (6127, Fig. 61), at 6212 a Net Risk Value for the N.E. Corridor enterprise can be generated. This can be input to the next level, the Level 3 Asset Amtrak, as described below. Finally, 6230 represents all of the data relative to the CA Coastal enterprise that was carried through from lower level computations. The processing required to generate that data is not shown, it being assumed that there is enterprise level data for the CA Coastal enterprise which was generated in an analogous manner as shown for the N.E. Corridor enterprise. Thus, from 6230 CA Coastal Asset Weighting Factors can be input to 6224, a CA Coastal ELGM can be input to 6225, and a CA Coastal Net Risk Value can be input to 6331, on Fig. 63. What will next be described is the combination of the N.E. Corridor enterprise data with the CA Coastal enterprise data to generate consolidated Amtrak data. First, for the N.E. Corridor data, beginning at 6220, the N.E. Corridor ELGM can be processed into an N.E. Corridor Real Value Factor Matrix at 6221. This matrix can in turn be transformed to an N.E. Corridor Proportional Real Value Factor Matrix at 6222. Similarly, at 6225, the CA Coastal ELGM can be processed into a CA Coastal Real Value Factor Matrix 6224, and that matrix can, in turn, be processed into a CA Coastal Proportional Real Value Factor Matrix at 6223. The two Proportional Real Value Factor matrices at 6222 and 6223 can be combined into an Amtrak Consolidated Real Value Factor Matrix at 6301, in Fig. 63. This matrix can, for example, be transformed into an Amtrak Level 3 Asset Gradient Matrix at 6310 using the equation
-LOG(cell value), with the cell values of the Amtrak Consolidated Real Value Factor Matrix 6301 as inputs. This Amtrak Level 3 Asset Gradient Matrix 6310 can be displayed graphically at 6320. Additionally, the Amtrak Level 3 Gradient Matrix can be transformed into an Amtrak Consolidated Build Coefficients Matrix using the equation 1 - 10^(-cell value), from which an Amtrak FOM Intermediary Product can be generated at 6312 by multiplying all elements of this matrix together. Finally, at 6321 an Amtrak FOM can be generated from such an Amtrak FOM Intermediary Product 6312 using the equation:
Amtrak FOM = -LOG(1 - Amtrak FOM Intermediary Product)
As noted above, in exemplary embodiments of the present invention, Net Risk Values for any level above the SIG level can be calculated by linearly adding the relevant lower level Net Risk Values. Thus, Net Risk Values follow a different data flow, or data path, than do the Gradient matrix and FOM calculations. Accordingly, as described above, the N.E. Corridor Net Risk Value from 6212 in Fig. 62 can be input to Fig. 63 at 6330 and the CA Coastal Net Risk Value from 6230 in Fig. 62 can be input to Fig. 63 at 6331. These two values can, for example, then be summed to generate an Amtrak Net Risk Value at 6340. Thus, at the far left of Fig. 63, at the culmination of the process flow for this example, there has been generated an Amtrak Level 3 Asset Gradient Matrix 6310, which can be displayed in the 3-D multicolored graphic 6320; an Amtrak FOM 6321, which is a measure of the risk vulnerability of the entire Amtrak system; and finally an Amtrak Net Risk Value 6340 (sometimes referred to as retained risk).
VII. EXEMPLARY INVERSE USE
In exemplary embodiments of the present invention there can also be an "inverse" use of a risk evaluation processes, as next described.
Normally, risk assessment is understood in the context of having one or more assets to protect that have intrinsic vulnerabilities and potential threats. In such contexts the objective is to lower the vulnerabilities in the face of potential threats so as to lower overall risk. However, in certain applications such as, for example, war gaming or counter-terrorism, an asset belongs to an enemy or aggressor. A user knows or strongly suspects its vulnerabilities and has no intention of lowering them; in fact, the user may seek to maximize them. In this context exemplary embodiments of the present invention can be used to analyze the effectiveness of the threats that may be generated. In such an exemplary application the data input is all the same, as described above, and the data processing is the same, but the activity level to determine the input data to the threat matrices is very high and the vulnerability matrix activity level is significantly reduced. In a conventional Risk Mitigation Analysis, as described above, threats are nominally set as determined and the values fixed, i.e., they cannot be controlled, and the best that can be done is to not miss one. All remediation is done on the vulnerabilities to MINIMIZE the risk to an Asset, and the iterative risk remediation analysis of 470 (Fig. 4) can have lots of activity vis-a-vis the vulnerability tables. However, in the inverse application, the opposite is true. Vulnerabilities can be known from assessing and spying on the opposition. What one varies in the game is the threat levels. The FOM improvement analysis 470 is all about trying different threats and seeing whether they maximize the hurt on the opposition's assets. Thus, in such an embodiment an "improved" FOM is a smaller FOM, indicating a higher risk. ROI calculations can, for example, follow the same process, but the intention on the input of the data is vastly different. In an inverse embodiment, ROI can be measured as the amount of risk AUGMENTATION divided by the costs of implementing a LOWER FOM. Hence the focus, and the activity level, is on the Threat tables (matrices). Threat values can be as granular as desired since the methods of the present invention are indifferent to granularity. In such an inverse use, retained risk values can be maximized, as opposed to minimized in the more conventional risk management applications as illustrated above.
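A minimal sketch of the inverted ROI described here, in which risk augmentation replaces risk reduction (all names and numbers are invented for illustration):

```python
def inverse_roi(asset_value, fom_before, fom_after, threat_cost):
    # Inverse ROI: risk AUGMENTATION per unit cost of driving the FOM down
    risk_before = asset_value * 10.0 ** (-fom_before)
    risk_after = asset_value * 10.0 ** (-fom_after)
    return (risk_after - risk_before) / threat_cost

# Invented example: new threats lower an adversary asset's FOM from 3.0 to 1.5.
print(round(inverse_roi(1_000_000_000, 3.0, 1.5, 2_000_000), 1))  # ~15.3
```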
While the present invention has been described with reference to certain exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. For example, the disclosed system and method can be used to analyze the risks of any asset or system of assets, where the concepts of "risk," "asset," and "asset value" are understood in the broadest possible sense. Thus, for example, the methods of the present invention could be equally applied to performing defensive/preventive retained risk evaluations of infrastructure in a homeland security context as well as to offensively looking for vulnerabilities to exploit in an enemy's military or economic assets. Alternatively, the methods of the present invention could also be applied to analyze the "risk" - in political terms, using some meaningful metric - that an opposing candidate faces in a political campaign comprising multiple candidates. For example, one might consider as vulnerabilities the weaknesses (education, record, finances, hidden information, dirt, cosmetics, speaking ability, knowledge, prior experience, etc.) expressed in the likelihood of looking bad, happening or being revealed in the campaign or relative to the candidate, and as threats, the strengths of the opposition (debates, misrepresentations, dirty tricks, cosmetics, outspending, media, etc.) expressed in the probability or likelihood of implementation in countering those vulnerabilities. Such an analysis can be done across several boundaries (assets), geographic or demographic, as well as considering the strengths and weaknesses of multiple candidates. In such an exemplary embodiment, the "what if" or iterative risk remediation analysis can be used against different strategies to foil the effectiveness of the competition and/or raise the potential voting point separation between candidates (i.e., lower the retained risk).
In addition, many modifications may be made to adapt a particular context or valuation metric to the teachings of the present invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims

WHAT IS CLAIMED:
1. A method for evaluating risks, comprising: dividing an asset into at least one level of sub-assets via a top-down analysis; assigning an asset value to the asset and each sub-asset; entering data representative of threat and vulnerability probabilities for each lowest level sub-asset's constitutive parts, wherein said data includes threat and vulnerability probabilities for each of a set of substantially comprehensive and independent categories; combining said data to generate an overall risk score and an output threat- vulnerability matrix for the asset and each sub-asset, and a net risk value for the asset and each sub-asset.
2. The method of claim 1, wherein the set of substantially comprehensive and independent categories is structured so as to accommodate additional independent categories as may be desired.
3. The method of claim 1, wherein said threat data is expressed as probabilities ranging from 0 to 1 in fixed point numbers.
4. The method of claim 1, wherein said vulnerability probabilities are expressed as fixed point numbers on a scale of 1 to N, wherein said numbers represent negative powers of ten.
5. The method of claim 3, wherein the number of significant figures can be increased to increase the granularity of said threat probabilities.
6. The method of claim 4, wherein N can be increased to increase the range of data values for vulnerability probabilities.
7. The method of claim 4, wherein the number of significant figures for N can be increased to increase the granularity of said vulnerability data.
8. The method of claim 1 wherein said data representative of vulnerability probabilities is based upon independent measures.
9. The method of claim 8 wherein said independent measures include at least one of actuarial and handbook failure data.
10. The method of claim 1 wherein said data representative of threat probabilities is based on independent measures from threat evaluation sources.
11. The method of claim 1 , wherein said overall risk score does not average out failures.
12. The method of claim 1 , further comprising displaying said output threat- vulnerability matrix in at least one of 2-dimensional "hot spot" plots and 3 dimensional surface plots.
13. The method of claim 12, wherein said plots depict areas most likely to warrant risk remediation and portray the risk profile in a nominal visual average similar to an overall risk score value.
14. The method of claim 1, wherein said combining said data to generate an overall risk score and output threat-vulnerability matrix for all asset levels above a lowest sub- asset level can be performed solely using the output data of the immediately lower level, without any need for the original input data.
15. The method of claim 1 , wherein output data is insulated from input source data so that objective numerical results can be compounded without revealing the risks of individual contributors in non-federated analyses.
16. The method of claim 15, wherein said non-federated analyses relate to an asset or sector one or more of whose lower level sub-assets are sovereign entities.
17. The method of claim 1 , wherein the risk sought to be analyzed is that retained by an adverse party.
18. The method of claim 17, wherein said analysis is designed to augment, maximize and/or exploit said risk.
19. The method of claim 1, wherein said asset value and net risk value are expressed in one of dollars or other currency, human lives, a quality of life valuation metric, and a strategic value valuation metric.
20. The method of claim 1, wherein for each asset and non lowest level sub-asset the overall risk score is generated by mathematically operating upon a mathematical transformation of the output threat-vulnerability matrix from a lower level sub-asset, and for a lowest level sub-asset it is generated from said lowest level sub-asset's constituent parts.
21. The method of claim 1 , wherein for each asset and non lowest level sub-asset the output threat-vulnerability matrix is generated from a mathematical combination of mathematical transformations of output threat- vulnerability matrices of all component sub-assets from the immediately lower asset level, and for a lowest level sub-asset it is generated from said lowest level sub-asset's constituent parts.
22. The method of claim 1, wherein for each asset and non lowest level sub-asset the net risk value is generated by a linear combination of the net risk values of all component sub-assets from the immediately lower asset level and for a lowest level sub-asset it is generated from said lowest level sub-asset's constituent parts.
23. The method of claim 1, wherein said lowest level sub-asset's constitutive parts comprise at least one component, and wherein said components each comprise at least one element.
24. The method of claim 1, wherein said data representative of threat and vulnerability probabilities for each component and element is weighted at either or both the elemental and componental levels.
25. The method of claim 1 , wherein said threat and vulnerability probabilities for each of a set of substantially comprehensive and independent categories is input by at least one of a user, a risk analyst and a computer program.
26. The method of claim 1, wherein upon obtaining said overall risk score for any asset or sub-asset an iterative risk remediation process can be performed.
27. The method of claim 26, wherein said iterative risk remediation process comprises:
selecting input vulnerabilities that are below a defined acceptable level; calculating the costs of remediating said vulnerabilities to an acceptable level; comparing said remediation costs with the diminution in net risk value for the asset as a result of remediating said vulnerabilities to calculate a return on investment; remediating the vulnerabilities if the net risk exceeds a defined acceptable value and said return on investment exceeds a defined value.
28. The method of claim 27, further comprising iterating said selecting, calculating, comparing and remediating until a diminishing returns point is reached.
29. The method of claim 1 , wherein upon obtaining said overall risk score for any asset or sub-asset an iterative risk evaluation process can be performed to maximize risk to an asset by varying threats.
30. The method of claim 29, wherein said iterative risk evaluation process comprises: selecting input threat values that are below a defined acceptable level; calculating the costs of elevating said threats to an acceptable level; comparing said elevation costs with the enhancement in net risk value for the asset as a result of elevating said threats to calculate a return on investment; elevating the threats if the Net Risk does not exceed a defined value and said return on investment is below a defined value; iterating said selecting, calculating, comparing and elevating until a diminishing returns point is reached.
31. A method for evaluating risks, comprising: dividing an asset into one or more sub-assets via a top-down analysis; dividing each lowest level sub-asset into one or more special interest groups (SIGs), each SIG into one or more components and each component into one or more elements; assigning an asset value to each asset, sub-asset and SIG; entering data representative of threat and vulnerability probabilities for each element of each component, wherein said data comprises a 12 x 6 threat probability and vulnerability matrix for each element and a relative importance weighting for each element and component; and for each SIG: for each component: transforming said elemental data into build coefficient matrices; and combining the elemental build coefficient matrices to generate a combined component level build coefficient matrix; combining the combined component build coefficient matrices to generate a SIG level build coefficient matrix; transforming the SIG level build coefficient matrix to generate a SIG matrix and a Figure of Merit (FOM); using the FOM to calculate SIG retained risk.
32. The method of claim 31 , further comprising displaying each SIG matrix graphically in at least one of a 3D multicolored, 2D contour plot and 2D hot spot plot.
33. The method of claim 32, further comprising: for all SIGs in an enterprise: transforming the SIG matrices to SIG real value factor (RVF) matrices; combining the SIG RVF matrices by multiplying each SIG RVF matrix by a weighting factor and summing to generate an enterprise level matrix; and summing SIG net risk values to generate an enterprise level net risk.
34. The method of claim 33, further comprising: for all Level N sub-assets in a Level N-1 sub-asset: combining Level N gradient matrices by multiplying each Level N gradient matrix by a weighting factor and summing to generate Level N-1 asset level gradient matrices; and summing Level N net risk values to generate Level N-1 level net risk.
35. A computer program product comprising a computer usable medium having computer readable program code means embodied therein, the computer readable program code means in said computer program product comprising means for causing a computer to: divide an asset into at least one level of sub-assets via a top-down analysis; assign an asset value to the asset and each sub-asset; enter data representative of threat and vulnerability probabilities for each lowest level sub-asset's constitutive parts, wherein said data includes threat and vulnerability probabilities for each of a set of substantially comprehensive and independent categories; and combine said data to generate an overall risk score, an output threat- vulnerability matrix, and a net risk value for the asset and each sub-asset.
36. The computer program product of claim 35, wherein the set of substantially comprehensive and independent categories is structured so as to accommodate additional independent categories as may be desired.
37. A computer program product comprising a computer usable medium having computer readable program code means embodied therein, the computer readable program code means in said computer program product comprising means for causing a computer to: divide an asset into one or more sub-assets via a top-down analysis; divide each lowest level sub-asset into one or more special interest groups (SIGs), each SIG into one or more components and each component into one or more elements; assign an asset value to each asset, sub-asset and SIG; enter data representative of threat and vulnerability probabilities for each element of each component, wherein said data comprises a 12 x 6 threat probability and vulnerability matrix for each element and a relative importance weighting for each element and component; and for each SIG to: transform said elemental data into build coefficient matrices; combine the elemental build coefficient matrices to generate a combined component level build coefficient matrix; combine the combined component build coefficient matrices to generate a
SIG level build coefficient matrix; transform the SIG level build coefficient matrix to generate a SIG matrix and a Figure of Merit (FOM); and use the FOM to calculate SIG retained risk.
38. The computer program product of claim 37, the computer readable program code means in said computer program product further comprising means for causing a computer to display each SIG matrix graphically in at least one of a 3D multicolored, 2D contour plot and 2D hot spot plot.
39. The computer program product of claim 38, the computer readable program code means in said computer program product further comprising means for causing a computer to: for all SIGs in an enterprise: combine the SIG matrices by multiplying each SIG matrix by a weighting factor and summing to generate enterprise level gradient matrices; and sum SIG net risk values to generate enterprise level net risk.
40. The computer program product of claim 39, the computer readable program code means in said computer program product further comprising means for causing a computer to: for all Level N sub-assets in a Level N-1 sub-asset: combine Level N gradient matrices by multiplying each Level N gradient matrix by a weighting factor and summing to generate Level N-1 asset level gradient matrices; and sum Level N net risk values to generate Level N-1 level net risk.
PCT/US2006/044228 2005-11-15 2006-11-14 Systems and methods for identifying, categorizing, quantifying and evaluating risks WO2008054403A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US28060505A 2005-11-15 2005-11-15
US11/280,605 2005-11-15

Publications (3)

Publication Number Publication Date
WO2008054403A2 true WO2008054403A2 (en) 2008-05-08
WO2008054403A9 WO2008054403A9 (en) 2008-07-10
WO2008054403A3 WO2008054403A3 (en) 2008-10-09

Family

ID=39344757

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/044228 WO2008054403A2 (en) 2005-11-15 2006-11-14 Systems and methods for identifying, categorizing, quantifying and evaluating risks

Country Status (1)

Country Link
WO (1) WO2008054403A2 (en)

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009117518A1 (en) * 2008-03-19 2009-09-24 Experian Information Solutions, Inc. System and method for tracking and analyzing loans involved in asset-backed securities
US20140188549A1 (en) * 2012-12-28 2014-07-03 Eni S.P.A. Risk assessment method and system for the security of an industrial installation
WO2014205497A1 (en) * 2013-06-26 2014-12-31 Climate Risk Pty Ltd Computer implemented frameworks and methodologies for enabling climate change related risk analysis
US8954459B1 (en) 2008-06-26 2015-02-10 Experian Marketing Solutions, Inc. Systems and methods for providing an integrated identifier
US8966649B2 (en) 2009-05-11 2015-02-24 Experian Marketing Solutions, Inc. Systems and methods for providing anonymized user profile data
US8972400B1 (en) 2013-03-11 2015-03-03 Consumerinfo.Com, Inc. Profile data management
US9147042B1 (en) 2010-11-22 2015-09-29 Experian Information Solutions, Inc. Systems and methods for data verification
US9152727B1 (en) 2010-08-23 2015-10-06 Experian Marketing Solutions, Inc. Systems and methods for processing consumer information for targeted marketing applications
US9508092B1 (en) 2007-01-31 2016-11-29 Experian Information Solutions, Inc. Systems and methods for providing a direct marketing campaign planning environment
US9563916B1 (en) 2006-10-05 2017-02-07 Experian Information Solutions, Inc. System and method for generating a finance attribute from tradeline data
US9619579B1 (en) 2007-01-31 2017-04-11 Experian Information Solutions, Inc. System and method for providing an aggregation tool
US9654541B1 (en) 2012-11-12 2017-05-16 Consumerinfo.Com, Inc. Aggregating user web browsing data
US9652802B1 (en) 2010-03-24 2017-05-16 Consumerinfo.Com, Inc. Indirect monitoring and reporting of a user's credit data
US9697263B1 (en) 2013-03-04 2017-07-04 Experian Information Solutions, Inc. Consumer data request fulfillment system
US9972048B1 (en) 2011-10-13 2018-05-15 Consumerinfo.Com, Inc. Debt services candidate locator
US10102536B1 (en) 2013-11-15 2018-10-16 Experian Information Solutions, Inc. Micro-geographic aggregation system
US10242019B1 (en) 2014-12-19 2019-03-26 Experian Information Solutions, Inc. User behavior segmentation using latent topic detection
US10255598B1 (en) 2012-12-06 2019-04-09 Consumerinfo.Com, Inc. Credit card account data extraction
US10262362B1 (en) 2014-02-14 2019-04-16 Experian Information Solutions, Inc. Automatic generation of code for attributes
US10339527B1 (en) 2014-10-31 2019-07-02 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US10380654B2 (en) 2006-08-17 2019-08-13 Experian Information Solutions, Inc. System and method for providing a score for a used vehicle
CN110298077A (en) * 2019-05-27 2019-10-01 中国汽车技术研究中心有限公司 The safe TARA analysis method of automobile information and digitization modeling system
US10437895B2 (en) 2007-03-30 2019-10-08 Consumerinfo.Com, Inc. Systems and methods for data verification
US10586279B1 (en) 2004-09-22 2020-03-10 Experian Information Solutions, Inc. Automated analysis of data to generate prospect notifications based on trigger events
US10593004B2 (en) 2011-02-18 2020-03-17 Csidentity Corporation System and methods for identifying compromised personally identifiable information on the internet
US10592982B2 (en) 2013-03-14 2020-03-17 Csidentity Corporation System and method for identifying related credit inquiries
US10699028B1 (en) 2017-09-28 2020-06-30 Csidentity Corporation Identity security architecture systems and methods
CN112235253A (en) * 2020-09-22 2021-01-15 杭州安恒信息技术股份有限公司 Data asset combing method and device, computer equipment and storage medium
US10896472B1 (en) 2017-11-14 2021-01-19 Csidentity Corporation Security and identity verification system and architecture
US10963434B1 (en) 2018-09-07 2021-03-30 Experian Information Solutions, Inc. Data architecture for supporting multiple search models
US11030562B1 (en) 2011-10-31 2021-06-08 Consumerinfo.Com, Inc. Pre-data breach monitoring
US11151468B1 (en) 2015-07-02 2021-10-19 Experian Information Solutions, Inc. Behavior analysis using distributed representations of event data
US11227001B2 (en) 2017-01-31 2022-01-18 Experian Information Solutions, Inc. Massive scale heterogeneous data ingestion and user resolution
CN114781937A (en) * 2022-06-20 2022-07-22 华网领业(杭州)软件有限公司 Method and device for pre-paid card enterprise risk early warning and storage medium
CN114969658A (en) * 2022-05-09 2022-08-30 中国人民解放军海军工程大学 Grouping sequential test method for exponential life type product
WO2023056259A1 (en) * 2021-09-29 2023-04-06 Bit Discovery, Inc. Asset inventorying system with in-context asset valuation prioritization
US11880377B1 (en) 2021-03-26 2024-01-23 Experian Information Solutions, Inc. Systems and methods for entity resolution
US11941065B1 (en) 2019-09-13 2024-03-26 Experian Information Solutions, Inc. Single identifier platform for storing entity data
US11954731B2 (en) 2023-03-06 2024-04-09 Experian Information Solutions, Inc. System and method for generating a finance attribute from tradeline data

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9529851B1 (en) 2013-12-02 2016-12-27 Experian Information Solutions, Inc. Server architecture for electronic data quality processing

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030093347A1 (en) * 2000-03-15 2003-05-15 Gray Dale F. Managing risk using macro-financial risk analysis
US20050240641A1 (en) * 2003-05-09 2005-10-27 Fujitsu Limited Method for predicting and avoiding danger in execution environment


US11568348B1 (en) 2011-10-31 2023-01-31 Consumerinfo.Com, Inc. Pre-data breach monitoring
US11030562B1 (en) 2011-10-31 2021-06-08 Consumerinfo.Com, Inc. Pre-data breach monitoring
US11012491B1 (en) 2012-11-12 2021-05-18 Consumerinfo.Com, Inc. Aggregating user web browsing data
US11863310B1 (en) 2012-11-12 2024-01-02 Consumerinfo.Com, Inc. Aggregating user web browsing data
US10277659B1 (en) 2012-11-12 2019-04-30 Consumerinfo.Com, Inc. Aggregating user web browsing data
US9654541B1 (en) 2012-11-12 2017-05-16 Consumerinfo.Com, Inc. Aggregating user web browsing data
US10255598B1 (en) 2012-12-06 2019-04-09 Consumerinfo.Com, Inc. Credit card account data extraction
US20140188549A1 (en) * 2012-12-28 2014-07-03 Eni S.P.A. Risk assessment method and system for the security of an industrial installation
US9697263B1 (en) 2013-03-04 2017-07-04 Experian Information Solutions, Inc. Consumer data request fulfillment system
US8972400B1 (en) 2013-03-11 2015-03-03 Consumerinfo.Com, Inc. Profile data management
US10592982B2 (en) 2013-03-14 2020-03-17 Csidentity Corporation System and method for identifying related credit inquiries
WO2014205496A1 (en) * 2013-06-26 2014-12-31 Climate Risk Pty Ltd Computer implemented frameworks and methodologies for enabling risk analysis for a system comprising physical assets
WO2014205497A1 (en) * 2013-06-26 2014-12-31 Climate Risk Pty Ltd Computer implemented frameworks and methodologies for enabling climate change related risk analysis
US10580025B2 (en) 2013-11-15 2020-03-03 Experian Information Solutions, Inc. Micro-geographic aggregation system
US10102536B1 (en) 2013-11-15 2018-10-16 Experian Information Solutions, Inc. Micro-geographic aggregation system
US11847693B1 (en) 2014-02-14 2023-12-19 Experian Information Solutions, Inc. Automatic generation of code for attributes
US10262362B1 (en) 2014-02-14 2019-04-16 Experian Information Solutions, Inc. Automatic generation of code for attributes
US11107158B1 (en) 2014-02-14 2021-08-31 Experian Information Solutions, Inc. Automatic generation of code for attributes
US11436606B1 (en) 2014-10-31 2022-09-06 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US10339527B1 (en) 2014-10-31 2019-07-02 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US11941635B1 (en) 2014-10-31 2024-03-26 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US10990979B1 (en) 2014-10-31 2021-04-27 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US10445152B1 (en) 2014-12-19 2019-10-15 Experian Information Solutions, Inc. Systems and methods for dynamic report generation based on automatic modeling of complex data structures
US11010345B1 (en) 2014-12-19 2021-05-18 Experian Information Solutions, Inc. User behavior segmentation using latent topic detection
US10242019B1 (en) 2014-12-19 2019-03-26 Experian Information Solutions, Inc. User behavior segmentation using latent topic detection
US11151468B1 (en) 2015-07-02 2021-10-19 Experian Information Solutions, Inc. Behavior analysis using distributed representations of event data
US11681733B2 (en) 2017-01-31 2023-06-20 Experian Information Solutions, Inc. Massive scale heterogeneous data ingestion and user resolution
US11227001B2 (en) 2017-01-31 2022-01-18 Experian Information Solutions, Inc. Massive scale heterogeneous data ingestion and user resolution
US11157650B1 (en) 2017-09-28 2021-10-26 Csidentity Corporation Identity security architecture systems and methods
US11580259B1 (en) 2017-09-28 2023-02-14 Csidentity Corporation Identity security architecture systems and methods
US10699028B1 (en) 2017-09-28 2020-06-30 Csidentity Corporation Identity security architecture systems and methods
US10896472B1 (en) 2017-11-14 2021-01-19 Csidentity Corporation Security and identity verification system and architecture
US10963434B1 (en) 2018-09-07 2021-03-30 Experian Information Solutions, Inc. Data architecture for supporting multiple search models
US11734234B1 (en) 2018-09-07 2023-08-22 Experian Information Solutions, Inc. Data architecture for supporting multiple search models
CN110298077A (en) * 2019-05-27 2019-10-01 中国汽车技术研究中心有限公司 Automotive information security TARA analysis method and digital modeling system
US11941065B1 (en) 2019-09-13 2024-03-26 Experian Information Solutions, Inc. Single identifier platform for storing entity data
CN112235253A (en) * 2020-09-22 2021-01-15 杭州安恒信息技术股份有限公司 Data asset sorting method and device, computer equipment, and storage medium
US11880377B1 (en) 2021-03-26 2024-01-23 Experian Information Solutions, Inc. Systems and methods for entity resolution
WO2023056259A1 (en) * 2021-09-29 2023-04-06 Bit Discovery, Inc. Asset inventorying system with in-context asset valuation prioritization
CN114969658A (en) * 2022-05-09 2022-08-30 中国人民解放军海军工程大学 Grouped sequential test method for products with exponential lifetime distributions
CN114781937A (en) * 2022-06-20 2022-07-22 华网领业(杭州)软件有限公司 Method, device, and storage medium for prepaid-card enterprise risk early warning
US11954731B2 (en) 2023-03-06 2024-04-09 Experian Information Solutions, Inc. System and method for generating a finance attribute from tradeline data

Also Published As

Publication number Publication date
WO2008054403A9 (en) 2008-07-10
WO2008054403A3 (en) 2008-10-09

Similar Documents

Publication Publication Date Title
WO2008054403A2 (en) Systems and methods for identifying, categorizing, quantifying and evaluating risks
Argomaniz et al. A decade of EU counter-terrorism and intelligence: A critical assessment
Voo et al. National cyber power index 2020: Methodology and analytical considerations
Dupont The cyber-resilience of financial institutions: significance and applicability
Hausken Cyber resilience in firms, organizations and societies
Brown et al. Analyzing the vulnerability of critical infrastructure to attack and planning defenses
Dreyer et al. Estimating the global cost of cyber risk
Simha et al. Straight from the horse’s mouth: Auditors’ on fraud detection and prevention, roles of technology, and white-collars getting splattered with red!
Rahman et al. Assessing cyber resilience of additive manufacturing supply chain leveraging data fusion technique: A model to generate cyber resilience index of a supply chain
CN102148820A (en) System and method for estimating network security situation based on index logarithm analysis
Atkins et al. An improvised patchwork: success and failure in cybersecurity policy for critical infrastructure
Appiah et al. Organizational architecture, resilience, and cyberattacks
Sukumar et al. Cyber risk assessment in small and medium‐sized enterprises: A multilevel decision‐making approach for small e‐tailors
Lin et al. Risk-Based V. Compliance-Based Utility Cybersecurity-a False Dichotomy
Pérez-Morón Eleven years of cyberattacks on Chinese supply chains in an era of cyber warfare, a review and future research agenda
Ros The making of a cyber crash: a conceptual model for systemic risk in the financial sector
Rasi et al. A literature review on blockchain technology: risk in supply chain management
Kujawski et al. Quantitative risk‐based analysis for military counterterrorism systems
Smith Mission dependency index of air force built infrastructure: Knowledge discovery with machine learning
Shahpasand et al. A comprehensive security control selection model for inter-dependent organizational assets structure
Ramamoorti et al. The pervasive impact of information technology on internal auditing
Kester et al. Crime predictive model in cybercrime based on social and economic factors using the Bayesian and Markov theories
Pamela et al. The Philippines' Cybersecurity Strategy: Strengthening partnerships to enhance cybersecurity capability
Trierweiler IT-based Fraud Management Approaches in Small and Medium Enterprises–A Multivocal Literature Review
Rahaman Recent advancement of cyber security: Challenges and future trends in Bangladesh

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in: Ref country code: DE
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 06851759; Country of ref document: EP; Kind code of ref document: A2)
122 Ep: pct application non-entry in european phase (Ref document number: 06851759; Country of ref document: EP; Kind code of ref document: A2)