WO2014176018A1 - Computerized indexing of catastrophic operational risk readiness - Google Patents

Computerized indexing of catastrophic operational risk readiness

Info

Publication number
WO2014176018A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
criterion
sub
insurer
factor analytic
Prior art date
Application number
PCT/US2014/033079
Other languages
French (fr)
Inventor
Martin C. HODELL
John Keane
Original Assignee
Mwh Americas Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mwh Americas Inc. filed Critical Mwh Americas Inc.
Publication of WO2014176018A1 publication Critical patent/WO2014176018A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00 - Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08 - Insurance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 - Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 - Querying
    • G06F16/332 - Query formulation
    • G06F16/3329 - Natural language query formulation or dialogue systems

Definitions

  • DSP: digital signal processor; ASIC: application-specific integrated circuit; FPGA: field programmable gate array.
  • a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • the functions described herein may be implemented in hardware, software/firmware, or combinations thereof. If implemented in software/firmware, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For example, due to the nature of software/firmware, functions described above can be implemented using software/firmware executed by, e.g., a processor, hardware, hardwiring, or combinations thereof. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.
  • Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage medium may be any available medium that can be accessed by a general- purpose or special-purpose computer.
  • computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
  • any connection is properly termed a computer-readable medium.
  • Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
  • embodiments of the inventive technology may involve a computerized framework for generating a catastrophe operational risk readiness index.
  • the framework (701) may utilize a risk management database (702) in which may be stored, for example, risk management data such as insurance preparedness foundation criteria, insurance preparedness enabler criteria, and insurer risk readiness question sets (705) (for example, for each criterion) or any other data relevant to the framework.
  • the insurance preparedness foundation criteria and insurance preparedness enabler criteria may establish, define, or otherwise create a standard to which the preparedness of an insurer may be evaluated, such as capability requirements for addressing catastrophic events.
  • the framework (701) may provide for each criterion to have one or more sub-criteria, and each sub-criterion to have one or more attributes.
  • a given criterion or sub-criterion may be implemented in the framework (701) via a criterion factor analytic module (703) or a sub-criterion factor analytic module (704), for example, in which such modules may perform analysis of factors relevant to such criteria or sub-criteria.
  • a given attribute may be implemented in the framework (701) via an attribute activity integration module (706), for example, in which such modules may integrate activity relevant to such attributes into an analysis.
  • such criteria, sub-criteria, and attributes may correspond to one or more characteristics of an insurer.
  • For example, a leadership and culture criterion may correspond to leadership and culture characteristics of an insurer, a people criterion may correspond to personnel characteristics of an insurer, a customer experience criterion may correspond to customer service aspects of an insurer, and so forth.
  • Some embodiments of the framework (701) may allow for customization of criteria, sub-criteria, and attributes, for example to address characteristics specific to given insurers, or to implement varied objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)).
  • the insurer risk readiness question sets (705) may test, reflect, or otherwise relate to the readiness of an insurer to address risk, such as the readiness of the insurer to respond to catastrophic events.
  • Question sets may be correlated to address specific criteria, sub-criteria, or attributes, and may consist of one or more questions (714) that may be directed to an insurer to elicit responses (707).
  • questions (714) in various embodiments may be structured (or programmed, for example in a computerized framework (701)) to test for one or more given criteria, sub-criteria, or attributes.
  • questions may be adaptive, as described elsewhere herein, and embodiments may include one or more sets of automatic adaptive questions modules (713), that may utilize adaptive principles alone or in conjunction with question sets (705) or questions (714).
  • the framework (701) may have an insurer question interface (708) responsive to a risk management database (702) for posing one or more questions (714) to an insurer, and an insurer response interface (709) for receiving responses (707) to such posed questions from the insurer.
  • Insurer question interfaces (708) may include speakers, displays, or the like; insurer response interfaces (709) may include microphones, keyboards, or the like. Such interfaces generally may include anything suitable for posing questions to and receiving responses from insurers.
  • Insurer responses (707) in various embodiments may be evaluated by an evaluative response scoring module (710) responsive to an insurer response interface (709).
  • This module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating the insurer response (707) in order to assign an evaluative response score to the response.
  • One example of such a methodology may be seen in Fig. 5, though naturally this example should not be construed to limit the inventive principles disclosed herein.
  • One or more evaluative response scores may then inform the assignment of an evaluative criterion score, such as where an evaluative criterion scoring module (711) may be responsive to an evaluative response scoring module (710).
  • this module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating one or more evaluative response scores in order to assign an evaluative criterion score for a criterion.
  • One example of such a methodology may be seen in Fig. 4, though naturally this example should not be construed to limit the inventive principles disclosed herein.
  • One or more evaluative criterion scores further may inform the assignment of a catastrophe operational risk readiness index for an insurer, such as where an evaluative index scoring module (712) may be responsive to an evaluative criterion scoring module (711).
  • this module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating one or more evaluative criterion scores in order to assign a catastrophe operational risk readiness index for an insurer.
  • One example of such a methodology may be seen in Fig. 3, though again this example should not be construed to limit the inventive principles disclosed herein.
  • a scoring module may be a computerized weighted scoring rubric module, as wherein a rubric for scoring may utilize weights for various components of the score. Any suitable weighting may be used consistent with the objectives of the framework (701) (for example, as may be programmed for a computerized framework (701)).
  • a computerized weighted scoring rubric module may utilize a differential weight for one of a criterion or criteria, a sub-criterion or sub-criteria, or an attribute or attributes.
  • an evaluative index scoring module (712) in some embodiments may utilize a scoring rubric that weights each foundation criterion at 20 percent and each enabler criterion at 5 percent, for example in assigning a catastrophe operational risk readiness index.
  • this example should not be construed to limit the inventive principles disclosed herein.
  • a scoring module may use any scoring rubric consistent with the inventive principles described herein. Examples may include an insurer logic rubric (as wherein a score may be based on consistency with an accepted logic), a question type rubric (as wherein a score may be based on consistency with accepted questions), a proficiency evidence rubric (as wherein a score may be based on evidence of proficiency), a summing rubric (as wherein a score may be based on summing component score elements), an averaging rubric (as wherein a score may be based on averaging component score elements), and the like, where the foregoing examples again should not be construed to limit the inventive principles disclosed herein.
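To make the weighted rubric concrete, the following is a minimal Python sketch of how an evaluative index scoring module might apply the 20 percent / 5 percent foundation/enabler weighting mentioned above. It is an illustration, not the patent's actual implementation: the assignment of the eleven criteria to foundational versus enabler groups is an assumption (a three/eight split makes the weights sum to 100 percent), as are all function and variable names.

```python
# Sketch of a computerized weighted scoring rubric module applying
# differential weights: 20% per foundation criterion, 5% per enabler
# criterion, per the example above. The split below is assumed so that
# 3 * 20% + 8 * 5% = 100%.

FOUNDATION_WEIGHT = 0.20
ENABLER_WEIGHT = 0.05

FOUNDATIONAL = {"Leadership and Culture", "People", "Insight and Planning"}  # assumed
ENABLER = {
    "Customer Experience", "Measures and Analytics", "Claims Response",
    "Brand and Reputation", "Sourcing and Partnering", "Technology and Systems",
    "Content Management", "Channel and Media",
}

def readiness_index(criterion_scores: dict) -> float:
    """Weight each evaluative criterion score (assumed 0-100 scale) and sum
    the weighted scores into a catastrophe operational risk readiness index."""
    index = 0.0
    for criterion, score in criterion_scores.items():
        weight = FOUNDATION_WEIGHT if criterion in FOUNDATIONAL else ENABLER_WEIGHT
        index += weight * score
    return index
```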
  • the framework (701) in various embodiments also may include an output display (715) for displaying the catastrophe operational risk readiness index.
  • the display should be understood in general terms, so as to encompass anything that can communicate the catastrophe operational risk readiness index, such as monitors, speakers, printers, and the like.
  • the catastrophe operational risk readiness index may be a standardized catastrophe risk readiness index, as principles of standardization are discussed elsewhere herein.
  • embodiments of the inventive technology may involve a computer architecture (801) for insurer response validity assessment in the determination of a catastrophe operational risk readiness index.
  • a risk management database (802) may store one or more substantive risk readiness questions (814) and one or more evidentiary validity assessment questions (815).
  • the substantive risk readiness questions (814) may test, reflect, or otherwise relate to the readiness of an insurer to address risk, such as the readiness of the insurer to respond to catastrophic events, and may be understood to be substantive in nature, as opposed to the evidentiary nature of the evidentiary validity assessment questions (815).
  • the substantive risk readiness questions (814) may include an approach question, a deployment question, a review question, and a linkage question.
  • Such questions may respectively address insurer practices such as approach (e.g., does the insurer have an approach to handling catastrophic risk, and is that approach written), deployment (e.g., has the insurer's approach been communicated to those elements that handle catastrophic risk), review (e.g., does the insurer have a review process for reviewing other risk events that may bear on the approach), and linkage (e.g., has the approach been linked to all elements of the insurer, even those not directly responsible for handling catastrophic risk).
  • this example is merely illustrative and should not be construed to limit the inventive principles disclosed herein as to the nature and function of the substantive risk readiness questions (814).
  • the evidentiary validity assessment questions (815) may establish, define, or otherwise create a standard to which the validity of responses of an insurer to the substantive risk readiness questions (814) may be evaluated.
  • the evidentiary validity assessment questions (815) therefore may be correlated to the substantive risk readiness questions (814), and may evaluate the validity of insurer responses to the substantive risk readiness questions (814).
  • Such correlation may be in any manner suitable to effect the evidentiary and/or validation functions they play with respect to the substantive questions to which they are correlated.
  • the computer architecture may have an insurer question interface (808) and an insurer response interface (809) for posing substantive risk readiness questions (814) and evidentiary validity assessment questions (815) and receiving substantive risk readiness responses (816) and evidentiary validity assessment responses (817) from an insurer.
  • Insurer substantive risk readiness responses (816) in various embodiments may be evaluated by an evaluative substantive risk readiness response scoring module (818) responsive to an insurer response interface (809).
  • This module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) for evaluating the insurer substantive risk readiness responses (816) in order to assign a substantive risk readiness question score.
  • the substantive risk readiness score may reflect a measure of insurer risk readiness.
  • Evidentiary validity assessment responses (817) in various embodiments may be evaluated by an evaluative evidentiary validity assessment response scoring module (819) responsive to an insurer response interface (809).
  • This module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) for evaluating the insurer evidentiary validity assessment responses (817) in order to assign an evidentiary validity assessment score.
  • the evidentiary validity assessment score may reflect a measure of insurer response validity. Validity, in turn, may encompass properties such as veracity, accuracy, reliability, and the like, perhaps providing a measure of soundness of the substantive risk readiness responses (816) and robustness to the catastrophe operational risk readiness index.
  • One or more substantive risk readiness question scores and one or more evidentiary validity assessment scores may be utilized to determine an insurer response validity assessment, such as where a validity assessment module (820) may utilize such substantive risk readiness question scores and such evidentiary validity assessment scores.
  • the insurer response validity assessment may be either a validation of one or more insurer responses or a disqualification of one or more insurer responses.
  • validation may involve the acceptance and subsequent use of the response in determining the index.
  • Disqualification may involve not using the disqualified insurer response in determining the index, such as perhaps by omitting the response from the determination, by requiring a new response from the insurer to replace the disqualified response, or the like.
  • this module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) to assess insurer response validity.
  • a validity assessment module (820) in some embodiments may be a threshold analysis validity assessment module, as wherein the assessment of insurer responses as being either valid or disqualified may depend on whether various scores fall within or outside of a threshold.
  • this example is merely illustrative and should not be construed to limit the inventive principles disclosed herein as to the nature and function of the validity assessment module (820).
  • an evaluative substantive risk readiness response scoring module (818), an evaluative evidentiary validity assessment response scoring module (819), and a validity assessment module (820) may each use any suitable methodologies for generating output consistent with the inventive principles described herein. As one example, embodiments may involve an evaluative substantive risk readiness response scoring module (818) utilizing a scoring rubric that equally weighs substantive insurer responses and takes a simple average of the responses to assign a substantive risk readiness score. Similarly, an evaluative evidentiary validity assessment response scoring module (819) may utilize a scoring rubric that equally weighs evidentiary insurer responses and takes a simple average of the responses to assign an evidentiary validity assessment score.
  • a validity assessment module (820) then may utilize a threshold analysis where: 1) the substantive risk readiness score is utilized for the validity assessment if the evidentiary validity assessment score is equal to or differs by no more than 10 percent of the substantive risk readiness score; 2) the evidentiary validity assessment score is utilized for the validity assessment if the evidentiary validity assessment score differs from the substantive risk readiness score by an amount greater than 10 percent but no more than 20 percent of the substantive risk readiness score; and 3) the insurer responses are disqualified if the evidentiary validity assessment score differs from the substantive risk readiness score by more than 20 percent of the substantive risk readiness score.
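A minimal Python sketch of this threshold analysis, combined with the equal-weight simple averaging described just above, might look as follows. Only the 10 and 20 percent bands come from the example; the function names, data shapes, and the zero-score guard are illustrative assumptions.

```python
def average(scores):
    """Equal-weight simple average of response scores, per the rubric above."""
    return sum(scores) / len(scores)

def assess_validity(substantive_scores, evidentiary_scores):
    """Threshold analysis from the example above. Returns the score to carry
    into the index, or None when the insurer responses are disqualified."""
    substantive = average(substantive_scores)   # substantive risk readiness score
    evidentiary = average(evidentiary_scores)   # evidentiary validity assessment score
    if substantive == 0:
        return None  # degenerate case; not addressed in the text
    # Deviation of the evidentiary score from the substantive score, as a
    # fraction of the substantive score.
    deviation = abs(evidentiary - substantive) / substantive
    if deviation <= 0.10:
        return substantive   # responses validated; use the substantive score
    if deviation <= 0.20:
        return evidentiary   # partial support; fall back to the evidentiary score
    return None              # responses disqualified
```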
  • the architecture (801) in various embodiments also may include an output display (812) for displaying the insurer response validity assessment.
  • the display should be understood in general terms, so as to encompass anything that can communicate the insurer response validity assessment, such as monitors, speakers, printers, and the like.
  • the basic concepts of the present inventive technology may be embodied in a variety of ways. It involves both risk readiness indexing techniques and devices to accomplish the appropriate risk readiness indexing.
  • the risk readiness indexing techniques are disclosed as part of the results shown to be achieved by the various devices described and as steps which are inherent to utilization. They are simply the natural result of utilizing the devices as intended and described.
  • While some devices are disclosed, it should be understood that they not only accomplish certain methods but also can be varied in a number of ways.
  • all of these facets should be understood to be encompassed by this disclosure.
  • any claims set forth at any time are hereby incorporated by reference as part of this description of the inventive technology, and the applicant expressly reserves the right to use all of or a portion of such incorporated content of such claims as additional description to support any of or all of the claims or any element or component thereof, and the applicant further expressly reserves the right to move any portion of or all of the incorporated content of such claims or any element or component thereof from the description into the claims or vice-versa as necessary to define the matter for which protection is sought by this application or by any subsequent continuation, division, or continuation-in-part application thereof, or to obtain any benefit of, reduction in fees pursuant to, or to comply with the patent laws, rules, or regulations of any country or treaty, and such content incorporated by reference shall survive during the entire pendency of this application including any subsequent continuation, division, continuation-in-part application thereof or any reissue or extension thereon.

Abstract

Catastrophic operational risk readiness may be subject to computerized indexing. A risk management database (702, 802) may store relevant criteria, sub-criteria, and attributes. Interfaces (708, 709, 808, 809) may pose questions (714, 814, 815) and receive responses (707, 816, 817). Relevant scores may be assigned by evaluative response scoring modules (710), evaluative criterion scoring modules (711), evaluative index scoring modules (712), evaluative substantive risk readiness response scoring modules (818), and evaluative evidentiary validity assessment response scoring modules (819). Validity assessment modules (820) may be utilized to assess response validity. A catastrophic operational risk readiness index may be generated.

Description

COMPUTERIZED INDEXING OF
CATASTROPHIC OPERATIONAL RISK READINESS
CROSS REFERENCE TO RELATED APPLICATION
This Application is an International PCT Patent Application claiming priority to and the benefit of U.S. Provisional Patent Application No. 61/816,025, filed April 5, 2013, hereby incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present disclosure is directed to the field of evaluating and addressing catastrophe operational risk readiness, and more specifically, to systems and methods to define and quantify a non-life insurer's preparedness to deal with catastrophic events characterized by high consequence and low probability.
BACKGROUND
Non-life insurers around the world are facing escalating costs of natural catastrophes. Furthermore, several areas exist in which non-life insurers appear to be systematically underprepared for catastrophic events, which may result in significant risks. The large percentage of realized operational risk in Hurricane Katrina, the Deepwater Horizon spill, and other catastrophes has led the Banking, Finance and Insurance Industry to recognize the importance of operational risk management processes. The present disclosure recognizes that by undertaking prudent pre-event steps, insurers may significantly reduce the costs of recovery in the event of a catastrophic event. Further, insurers and re-insurers have systematically underestimated the cost of catastrophe-driven operational risk, hence the relative merit of mitigating that risk has been understated.
DISCLOSURE OF THE INVENTION
In one embodiment, an object of the inventive technology is to provide a computerized framework for generating a catastrophe operational risk readiness index comprising: a risk management database for storing a plurality of insurance preparedness foundation criteria, a plurality of insurance preparedness enabler criteria, and an insurer risk readiness question set for each said criterion; an insurer question interface responsive to said risk management database for posing each question of each said insurer risk readiness question set to an insurer; an insurer response interface for receiving responses to each said posed question from said insurer; an evaluative response scoring module responsive to said insurer response interface for assigning an evaluative response score to each said insurer response; an evaluative criterion scoring module responsive to said evaluative response scoring module for assigning an evaluative criterion score to each said criterion; an evaluative index scoring module responsive to said evaluative criterion scoring module for assigning a catastrophe operational risk readiness index for said insurer; an output display for displaying said catastrophe operational risk readiness index.
In another embodiment, an object of the inventive technology is to provide a computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index comprising: a risk management database for storing a plurality of substantive risk readiness questions and a plurality of evidentiary validity assessment questions correlated to said substantive risk readiness questions; an insurer question interface responsive to said risk management database for posing each said substantive risk readiness question and each said evidentiary validity assessment question to said insurer; an insurer response interface for receiving substantive risk readiness responses and evidentiary validity assessment responses from said insurer; an evaluative substantive risk readiness response scoring module responsive to said insurer response interface for assigning a substantive risk readiness question score to each said insurer substantive risk readiness response; an evaluative evidentiary validity assessment response scoring module responsive to said insurer response interface for assigning an evidentiary validity assessment score to each said insurer evidentiary validity assessment response; a validity assessment module for utilizing said substantive risk readiness question score and said evidentiary validity assessment score to assess insurer response validity; an output display for displaying said insurer response validity assessment.
In another embodiment, an object of the inventive technology is to provide a method for evaluating preparedness of an organization for a catastrophic event, comprising: providing a user with a catastrophe operational risk readiness (CORR) index assessment question set, the question set including a plurality of questions related to one or more criteria and/or attributes of the organization; receiving responses from the user to the question set; compiling and scoring the responses; and determining component and overall scores for CORR based on the response scores and predetermined weighting associated with the one or more criteria and/or attributes.
Naturally, further objects of the inventive technology will become apparent from the description and drawings below.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic representation depicting a conceptual framework for generating a catastrophe operational risk readiness index in one exemplary embodiment.
Fig. 2 is a schematic representation of foundational criteria and enabler criteria in one exemplary embodiment.
Fig. 3 is a schematic representation of an evaluative index scoring module methodology in one exemplary embodiment.
Fig. 4 is a schematic representation of an evaluative criterion scoring module methodology in one exemplary embodiment.
Fig. 5 is a schematic representation of an evaluative response scoring module methodology in one exemplary embodiment.
Fig. 6 is a schematic representation of a computer system for evaluating preparedness of an organization for a catastrophic event in one exemplary embodiment.
Fig. 7 is a schematic representation of a computerized framework for generating a catastrophe operational risk readiness index in one exemplary embodiment.
Fig. 8 is a schematic representation of a computer architecture for insurer response validity assessment in one exemplary embodiment.
MODES FOR CARRYING OUT THE INVENTION
The present inventive technology includes a variety of aspects, which may be combined in different ways. The following descriptions are provided to list elements and describe some of the embodiments of the present inventive technology. These elements are listed with initial embodiments; however, it should be understood that they may be combined in any manner and in any number to create additional embodiments. The variously described examples and preferred embodiments should not be construed to limit the present inventive technology to only the explicitly described systems, techniques, and applications. Further, this description should be understood to support and encompass descriptions and claims of all the various embodiments, systems, techniques, methods, devices, and applications with any number of the disclosed elements, with each element alone, and also with any and all various permutations and combinations of all elements in this or any subsequent application.
The present disclosure provides a Catastrophe Operational Risk Readiness (CORR) Index, which is a systematic framework of criteria and attributes that identifies strengths and weaknesses of non-life insurers with respect to operational risk following a low probability, high consequence catastrophic event. The CORR Index provides a standard scoring approach that enables insurers, reinsurers, and regulators to evaluate risk readiness. By following the CORR Approach, an insurer can identify areas for improvement in preparation for a major catastrophe prior to the impact of the event itself.
Various aspects of the disclosure provide a framework and scoring approach to assess non-life insurers' degree of maturity in operational risk management (as distinct from insurance risk, etc.) in the context of low-probability / high consequence catastrophic events (as distinct from "business as usual"). According to embodiments, the framework includes the following:
(A) Identification of specific elements of Catastrophe Operational Risk Readiness ("Criteria") and a justification for their selection. Such elements may include, for example:
• Leadership and Culture
• Customer Experience
• People
• Measures and Analytics
• Claims Response
• Brand and Reputation
• Insight and Planning
• Sourcing and Partnering
• Technology and Systems
• Content Management
• Channel and Media
(B) Explanation of the interaction between Criteria, as some are foundational elements influencing others, as an input to Criteria weighting and index development;
(C) Decomposition and Definition of sub-elements ("Attributes") for each Criterion;
(D) Specification of a 'scoring' approach/rubric for each Criterion and each underlying Attribute, i.e., by what logic would an insurer score a given rating; what questions would one ask; what kinds of evidence would demonstrate proficiency;
(E) Determination of an overall index "score" based upon weightings and scores per attribute/Criterion; and
(F) Description of the evaluation process alternatives.
Utilizing such factors, the CORR Index may enable a non-life insurer to make meaningful improvement to its catastrophe operational risk readiness by highlighting areas of weakness and strength in a defensible, benchmarkable index.
The present disclosure recognizes that existing approaches/tools are deficient in a number of respects, such as defining the elements of operational risk management in the context of low probability, high consequence catastrophic events (and explaining which elements matter more and why), and quantifying an insurer's maturity in catastrophe operational risk management. Through systems and methods described herein, an entity may perform a catastrophe operational risk readiness assessment and initiate an action plan to mitigate one or more identified risks.
Various aspects of the disclosure provide a CORRI that enables an entity to (1) focus on operational risk versus other classes of insurance risk; (2) focus on catastrophe preparedness in particular; and (3) apply a defined index composition and scoring mechanics. As noted above, operational risk may be a primary focus, and is one of a number of overall risks present in non-life insurers. Such overall risks may be categorized into classes of risk, including: Insurance Risk; Market Risk; Credit Risk; Liquidity Risk; Strategic Risk; and Operational Risk. Within Operational Risk, non-life insurers are exposed to a number of risks in responding to low probability, high consequence events, such as Claims Management; Information Management; Stakeholder Engagement; and Supply Chain Viability, to name a few examples. Much of the operational risk management attention is directed at assessing business continuity management (i.e., ensuring the insurer could operate in the event of some catastrophe), whereas the focus of the present disclosure is on the underwritten risks themselves and how the insurer improves operational readiness and ultimately operational performance. As noted above, a focus of various embodiments is on catastrophe preparedness. Catastrophe risk is characterized by low probability of occurrence and very high consequence given occurrence. Given this low probability, most non-life insurers tend to only deal with catastrophe occurrence every 15-30 years. As such, internal processes and standard operating procedures are, at best, out-of-date or irrelevant, and at worst, non-existent.
In one embodiment, a CORRI is provided that includes a CORR framework, question set, and scoring rubric that will be described in further detail below. An Index score is calculated by evaluating question responses with the scoring rubric, which has been based upon an industry-standard capability maturity model. Response scores are weighted, grouped, and summed according to the CORR framework in order to produce an indexed, benchmarkable score of a non-life insurer's readiness to respond operationally to a high-consequence catastrophe. The framework provides, according to various embodiments, a defined scoring approach and rubric, enabling both 'self-scoring' by insurers and a 'certified scoring' that may be conducted by approved third party consultants. Over time, the quantitative weightings of criteria and attributes may be tested and refined as the underlying drivers of risk maturity continue to develop.
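As a concrete illustration of this pipeline, the sketch below scores individual question responses against a maturity rubric, groups them into criterion scores, and then weights and sums the criterion scores into the index. This is a hedged Python sketch: the five maturity levels, the normalization to a 0-100 scale, and the simple-mean grouping are assumptions, since the disclosure specifies only that responses are scored against a capability-maturity rubric and then weighted, grouped, and summed.

```python
# Illustrative CORR scoring flow: rubric -> response scores ->
# criterion scores -> weighted index. Level names and scaling assumed.

MATURITY_LEVELS = {"non-existent": 0, "ad hoc": 1, "defined": 2,
                   "managed": 3, "optimized": 4}  # hypothetical 5-level rubric

def score_response(maturity_rating: str) -> float:
    """Map a rubric maturity rating for one response to a 0-100 score."""
    return 100.0 * MATURITY_LEVELS[maturity_rating] / (len(MATURITY_LEVELS) - 1)

def criterion_score(response_scores):
    """Group the scored responses for one criterion (simple mean assumed;
    the framework may instead prescribe attribute-level weights)."""
    return sum(response_scores) / len(response_scores)

def weighted_index(criterion_scores, weights):
    """Weight and sum criterion scores per the framework (weights sum to 1)."""
    return sum(weights[c] * s for c, s in criterion_scores.items())

# Example: two criteria, each with a few rubric-rated responses.
scores = {
    "Leadership and Culture": criterion_score(
        [score_response(r) for r in ("managed", "defined", "optimized")]),
    "Claims Response": criterion_score(
        [score_response(r) for r in ("ad hoc", "defined")]),
}
index = weighted_index(scores, {"Leadership and Culture": 0.7,
                                "Claims Response": 0.3})
```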
Utilizing systems and methods provided herein, an entity may be provided economic incentive for near-term preparedness, by enabling catastrophe risk ecosystem stakeholders (governments, regulators, reinsurers) to reward better prepared non-life insurers through lower capital reserve requirements and/or lower reinsurance pricing. Furthermore, by standardizing the criteria and attributes of catastrophe preparedness, and by standardizing a scoring approach, the readiness rating may be comparable across insurers and over time for a particular insurer. Additionally, the scoring approach is capable of being certified by a third party. Catastrophe risk ecosystem stakeholders, to the extent they can compare certified risk readiness ratings, are then free to tailor pricing, capital reserve requirements, or availability of reinsurance cover. If a non-life insurer wished to conduct an evaluation of its catastrophic operational readiness without using a defined index, it could perform a customized, in-house evaluation. Such an approach, however, sacrifices certification and comparability with others, and hence has less communication power with reinsurers and regulators. Finally, by focusing on the operational risks associated with low frequency, high consequence catastrophes, a situation not frequently evaluated by the non-life insurance industry, the CORRI enables pre-event managerial decisions regarding systems and process improvements in order to mitigate downstream operational risk.
With reference now to Figures 1-3, an embodiment of a catastrophe risk readiness index system is described. In the figures, the CORR framework taxonomy, shown in Figure 1, illustrates the structure of the framework criteria (specifically the distinction between Foundational and Enabling elements) according to an embodiment. Figure 2 is a more detailed illustration of a framework taxonomy according to an embodiment. Finally, the CORR taxonomy and framework indicate how the criteria weighting is established, as shown in Figure 3. In addition, the criteria scoring rubric/guidance and matrix, according to some embodiments, have been detailed according to an industry-standard maturity assessment model (similar in approach and convention to those used in EFQM, Baldrige, CMM, PCMM, and CMAT). Figures 4 and 5, below, demonstrate the index score matrix and scoring guidance that may be used by an assessment evaluator.
In some embodiments, the CORR includes a framework for evaluation. In one example, this framework includes 11 criteria, 25 sub-criteria, and 73 specific attributes, a list of detailed questions intended to drive at current performance against the framework, and a defined evaluation approach / scoring rubric. The 11 criteria of the CORR Index of this example are listed in the table below:
Criteria
Leadership and Culture
Customer Experience
People
Measures and Analytics
Claims Response
Brand and Reputation
Insight and Planning
Sourcing and Partnering
Technology and Systems
Content Management
Channel and Media
As noted above, for each criterion, in this example, there are a number of sub-criteria, and for each sub-criterion there are a number of specific attributes. The sub-criteria and specific attributes for each criterion in this example are:
[Sub-criteria and specific attribute tables for the preceding criteria are reproduced as images in the original publication; legible fragments include: Capacity; People Environment; Physical Environment; Policy Environment.]
Brand and Reputation
Sub-Criteria / Specific Attributes
• Reputational Risk Identification and Consensus: Reputational Risk Identification; Reputational Risk Consensus; Reputational Risk Stakeholder Involvement
• Reputational Risk Strategy: Reputational Risk Strategy Development Process; Reputation Strategic Objectives
• Reputational Risk Management: Reputational Risk Management Framework; Reputation Management Governance

Insight and Planning
Sub-Criteria / Specific Attributes
• Insight: Insight Adoption and Categorization; Insight Utilization Process
• Planning: Planning Process; Statement of CORR Objectives; Deployment and Alignment; Measurement
[Further sub-criteria and specific attribute tables are reproduced as images in the original publication.]

Content Management
Sub-Criteria / Specific Attributes
• Content Definition
• Content Access and Usability: Content Availability; Open and Distributable; Customizable; Measurable; Security; Technology
• Content Management Lifecycle Process: Process; Content Retrieval; Content Categorization; Content Distribution

[The remaining table is reproduced as an image in the original publication.]
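The criterion, sub-criterion, and attribute hierarchy tabulated above lends itself to a simple nested representation. The following is a minimal Python sketch, populated with a fragment of the Brand and Reputation criterion from the table above; the dataclass names and the per-criterion weight field are illustrative assumptions rather than structures disclosed in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str

@dataclass
class SubCriterion:
    name: str
    attributes: list = field(default_factory=list)

@dataclass
class Criterion:
    name: str
    weight: float                  # assumed criterion-level weight
    sub_criteria: list = field(default_factory=list)

brand_and_reputation = Criterion(
    name="Brand and Reputation",
    weight=0.05,                   # illustrative value only
    sub_criteria=[
        SubCriterion("Reputational Risk Strategy", [
            Attribute("Reputational Risk Strategy Development Process"),
            Attribute("Reputation Strategic Objectives"),
        ]),
    ],
)
```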
In some embodiments, a framework provides the organization of the index score, which includes the operational functions, the weighting/importance of the functions, and the approach to score determination, including question, attribute, sub-criteria, criteria, and overall scoring. The question set, in some embodiments, comprises the foundation of the evaluation. A CORR user may respond to the question set in such a way that indicates current strategy and approach in enough detail to indicate evidence. The question set according to various embodiments is designed to be adaptive, such that a user need only answer further detailed questions for which significant evidence is supplied in the preceding response, for example. In addition, the question set may step through a logical flow from approach, through deployment, review/update, and integration/alignment in order to enable effective scoring against the maturity-defined rubric. Finally, a scoring rubric may provide the rules and definitions used by the evaluator in determining the score of a particular question response. The rubric is provided in order to enable standardization within a particular evaluation, and across multiple responses to the Index Assessment. The framework, questions, and scoring rubric are all unique to the CORR Index of the present disclosure. Each component has been developed with the purpose of enabling a standardized scoring approach for the determination of Catastrophe Operational Risk Readiness. In some aspects, the maturity assessment foundation utilized in the scoring rubric is an industry-standard approach used in such applications as EFQM, Baldrige, CMM, PCMM, and CMAT.
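The adaptive flow described in this paragraph can be pictured with a short sketch: questions step through the approach, deployment, review/update, and integration/alignment stages, and the more detailed questions within a stage are posed only while the preceding response supplies sufficient evidence. This is a hedged Python illustration; the evidence rating and its threshold are assumptions introduced for the example.

```python
STAGES = ("approach", "deployment", "review/update", "integration/alignment")
EVIDENCE_THRESHOLD = 2   # assumed minimum evidence rating to continue drilling down

def administer(question_sets, ask):
    """question_sets maps each stage to its ordered questions, from general to
    detailed. ask(question) returns (answer, evidence_rating). Detailed
    questions are skipped once a response lacks significant evidence."""
    responses = {}
    for stage in STAGES:
        for question in question_sets.get(stage, []):
            answer, evidence = ask(question)
            responses[question] = answer
            if evidence < EVIDENCE_THRESHOLD:
                break   # skip the remaining, more detailed questions in this stage
    return responses
```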
When performing the assessment, each CORR question corresponds to a particular component of the framework. As such, the framework may organize the questions into a logical arrangement of functional areas and attributes. The scoring rubric is used to evaluate a user's responses in a standard manner. Finally, the scores, as indicated by the scoring rubric, are weighted and summed according to the framework. More specifically, according to some embodiments, first an evaluator may provide the user(s) with the CORR Index Assessment question set, within the CORR framework. Second, the user(s) respond(s) to the questions according to the prescribed order and methodology. Third, the evaluator compiles the responses, verifies the information contained, and scores the responses according to the scoring rubric. Finally, the evaluator uses the criteria/attribute weighting prescribed by the framework to produce component and overall scores. According to some embodiments, an evaluator may adjust the criteria and attributes, and/or their relative weightings, based on an insurer's business model, requirements, and consumer demands. In other embodiments, an evaluator might also adjust the specific questions posed per criterion in order to generate the criteria/overall index scores, and an evaluator may be free to modify the assessment process (responses, scoring methodology, steps, etc.) to arrive at the index.
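The weighted roll-up described in this paragraph might be sketched as below; the rubric scores, weights, and function names are hypothetical placeholders rather than values or terms from the disclosure.

```python
# Sketch of the scoring roll-up: rubric scores per response are averaged
# into criterion scores, which are then weighted and summed into the
# overall index. Scores and weights below are hypothetical placeholders.

from statistics import mean

def criterion_score(response_scores: list[float]) -> float:
    """Combine rubric scores for a criterion's responses (simple average)."""
    return mean(response_scores)

def overall_index(criterion_scores: dict[str, float],
                  weights: dict[str, float]) -> float:
    """Weighted sum of criterion scores per the framework's weighting."""
    return sum(score * weights[name] for name, score in criterion_scores.items())

scores = {
    "Leadership and Culture": criterion_score([4, 3, 5]),
    "Claims Response": criterion_score([2, 3, 3]),
}
weights = {"Leadership and Culture": 0.6, "Claims Response": 0.4}
print(round(overall_index(scores, weights), 2))  # 0.6*4.0 + 0.4*2.67 -> 3.47
```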
With reference now to FIG. 6, a block diagram illustrates a computing system that may be used in determining a CORR index according to various embodiments. It will be readily understood that a system such as illustrated in FIG. 6 may include any suitable device capable of operating to perform the functions described herein, and the particular components illustrated in FIG. 6 are for purposes of illustration and discussion of general concepts described herein. In this embodiment, a database 600 may be used to store information related to the framework that provides the organization of the index score, and may include the weighting/importance of functions, the scores for different criteria, question sets, attributes, sub-criteria, criteria, and other information that will be readily apparent to one of skill in the art. For example, the database 600 may store the question set, which may be retrieved by a query module 610 of computer 602. A user may view questions and respond to the question set using user interface module 615 that is coupled with a controller/processor 605. A scoring module 620 may perform scoring for the responses to the question set based on a scoring rubric, such as described above. A weighting module 625 may provide weighting to one or more criteria/attributes according to weighting prescribed by the framework to produce component and overall scores. The question set, associated responses, and scoring information may be stored back in database 600, or any other suitable memory device, for later retrieval and/or analysis.
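Purely as a sketch, the FIG. 6 components might be mirrored in code as follows; the class and method names, and the toy rubric, are assumptions for illustration and not identifiers from the disclosure.

```python
# Rough mirror of the FIG. 6 components: a database holding the question
# set, a query module that retrieves it, a scoring module applying a
# rubric, and a weighting module producing the overall score.

class RiskDatabase:                      # database 600
    def __init__(self):
        self.question_set = ["Q1", "Q2"]
        self.results = {}

class QueryModule:                       # query module 610
    def __init__(self, db):
        self.db = db
    def fetch(self):
        return list(self.db.question_set)

class ScoringModule:                     # scoring module 620
    def score(self, responses):
        # Toy rubric stand-in: score each response 0-5 by word count.
        return {q: min(5, len(r.split())) for q, r in responses.items()}

class WeightingModule:                   # weighting module 625
    def combine(self, scores, weights):
        return sum(scores[q] * weights[q] for q in scores)

db = RiskDatabase()
responses = {q: "a sample insurer response" for q in QueryModule(db).fetch()}
scores = ScoringModule().score(responses)
db.results["CORR"] = WeightingModule().combine(scores, {"Q1": 0.5, "Q2": 0.5})
print(db.results)  # scores and index stored back for later retrieval
```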
The detailed description set forth above in connection with the appended drawings describes exemplary embodiments and does not represent the only embodiments that may be implemented or that are within the scope of the claims. The term "exemplary" used throughout this description means "serving as an example, instance, or illustration," and not "preferred" or "advantageous over other embodiments." The detailed description includes specific details for the purpose of providing an understanding of the described techniques. These techniques, however, may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid obscuring the concepts of the described embodiments.
The various illustrative modules described in connection with the disclosure herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The functions described herein may be implemented in hardware, software/firmware, or combinations thereof. If implemented in software/firmware, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope and spirit of the disclosure and appended claims. For example, due to the nature of software/firmware, functions described above can be implemented using software/firmware executed by, e.g., a processor, hardware, hardwiring, or combinations thereof. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations. Also, as used herein, including in the claims, "or" as used in a list of items prefaced by "at least one of" indicates a disjunctive list such that, for example, a list of "at least one of A, B, or C" means A or B or C or AB or AC or BC or ABC (i.e., A and B and C).
Computer-readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example, and not limitation, computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software/firmware is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
The previous description of the disclosure is provided to enable a person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Throughout this disclosure the term "example" or "exemplary" indicates an example or instance and does not imply or require any preference for the noted example. Thus, the disclosure is not to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Now with reference primarily to Fig. 7, embodiments of the inventive technology may involve a computerized framework for generating a catastrophe operational risk readiness index.
The framework (701) may utilize a risk management database (702) in which may be stored, for example, risk management data such as insurance preparedness foundation criteria, insurance preparedness enabler criteria, and insurer risk readiness question sets (705) (for example, for each criterion) or any other data relevant to the framework.
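One minimal way to picture the contents of such a database is sketched below; the structure, field names, and sample questions are illustrative assumptions only.

```python
# Hypothetical sketch of risk management database (702) contents: the two
# families of criteria and a per-criterion question set (705). The field
# names and sample questions are illustrative assumptions.

risk_management_db = {
    "foundation_criteria": ["Leadership and Culture", "Customer Experience", "People"],
    "enabler_criteria": ["Measures and Analytics", "Claims Response",
                         "Brand and Reputation", "Insight and Planning"],
    "question_sets": {
        "Leadership and Culture": [
            "How are catastrophe readiness objectives set and communicated?",
            "How is accountability for readiness decisions assigned?",
        ],
        # ... one question set per criterion
    },
}
```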
The insurance preparedness foundation criteria and insurance preparedness enabler criteria may establish, define, or otherwise create a standard against which the preparedness of an insurer may be evaluated, such as capability requirements for addressing catastrophic events. Moreover, the framework (701) may provide for each criterion to have one or more sub-criteria, and each sub-criterion to have one or more attributes.
A given criterion or sub-criterion may be implemented in the framework (701) via a criterion factor analytic module (703) or a sub-criterion factor analytic module (704), for example, where such modules may perform analysis of factors relevant to the criteria or sub-criteria. Moreover, a given attribute may be implemented in the framework (701) via an attribute activity integration module (706), for example, where such modules may integrate activity relevant to the attributes into an analysis. Naturally, all criteria, sub-criteria, and attributes as are described herein may be implemented by such modules.
In various embodiments, such criteria, sub-criteria, and attributes may correspond to one or more characteristics of an insurer. For example, a leadership and culture criterion may correspond to leadership and culture characteristics of an insurer, a people criterion may correspond to personnel characteristics of an insurer, a customer experience criterion may correspond to customer service aspects of an insurer, and so forth. Some embodiments of the framework (701) may allow for customization of criteria, sub-criteria, and attributes, for example to address characteristics specific to given insurers, or to implement varied objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)).
The insurer risk readiness question sets (705) may test, reflect, or otherwise relate to the readiness of an insurer to address risk, such as the readiness of the insurer to respond to catastrophic events. Question sets may be correlated to address specific criteria, sub-criteria, or attributes, and may consist of one or more questions (714) that may be directed to an insurer to elicit responses (707). For example, questions (714) in various embodiments may be structured (or programmed, for example in a computerized framework (701)) to test for one or more given criteria, sub-criteria, or attributes. Moreover, questions may be adaptive, as described elsewhere herein, and embodiments may include one or more sets of automatic adaptive questions modules (713), that may utilize adaptive principles alone or in conjunction with question sets (705) or questions (714).
The framework (701) may have an insurer question interface (708) responsive to a risk management database (702) for posing one or more questions (714) to an insurer, and an insurer response interface (709) for receiving responses (707) to such posed questions from the insurer. Examples of insurer question interfaces (708) may include speakers, displays, or the like, and examples of insurer response interfaces (709) may include microphones, keyboards, or the like; such interfaces generally may include anything suitable for posing questions to and receiving responses from insurers.
Insurer responses (707) in various embodiments may be evaluated by an evaluative response scoring module (710) responsive to an insurer response interface (709). This module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating the insurer response (707) in order to assign an evaluative response score to the response. One example of such a methodology may be seen in Fig. 5, though naturally this example should not be construed to limit the inventive principles disclosed herein.
One or more evaluative response scores may then inform the assignment of an evaluative criterion score, such as where an evaluative criterion scoring module (711) may be responsive to an evaluative response scoring module (710). Again, this module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating one or more evaluative response scores in order to assign an evaluative criterion score for a criterion. One example of such a methodology may be seen in Fig. 4, though naturally this example should not be construed to limit the inventive principles disclosed herein.
One or more evaluative criterion scores further may inform the assignment of a catastrophe operational risk readiness index for an insurer, such as where an evaluative index scoring module (712) may be responsive to an evaluative criterion scoring module (711). Once more, this module may use any suitable methodology consistent with the objectives of the framework (701) (for example, as such objectives may be programmed in a computerized framework (701)) for evaluating one or more evaluative criterion scores in order to assign a catastrophe operational risk readiness index for an insurer. One example of such a methodology may be seen in Fig. 3, though again this example should not be construed to limit the inventive principles disclosed herein.
In some embodiments, a scoring module may be a computerized weighted scoring rubric module, as wherein a rubric for scoring may utilize weights for various components of the score. Any suitable weighting may be used consistent with the objectives of the framework (701) (for example, as may be programmed for a computerized framework (701)). For example, in various embodiments a computerized weighted scoring rubric module may utilize a differential weight for one of a criterion or criteria, a sub-criterion or sub-criteria, or an attribute or attributes. As but one example, an evaluative index scoring module (712) in some embodiments may utilize a scoring rubric that weights each foundation criterion at 20 percent and each enabler criterion at 5 percent, for example in assigning a catastrophe operational risk readiness index. However, this example should not be construed to limit the inventive principles disclosed herein.
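To make the example weighting concrete: with the three foundation criteria of the example framework at 20 percent each and the eight enabler criteria at 5 percent each, exactly 100 percent of the weight is allocated. A worked sketch follows, with hypothetical criterion scores on an arbitrary 0-to-5 scale.

```python
# Worked example of the 20%/5% weighting: 3 foundation criteria * 20%
# plus 8 enabler criteria * 5% = 100%. The criterion scores below are
# hypothetical placeholders on a 0-5 scale.

foundation = {"Leadership and Culture": 4.0, "Customer Experience": 3.5, "People": 4.5}
enabler = {"Measures and Analytics": 3.0, "Claims Response": 4.0,
           "Brand and Reputation": 2.5, "Insight and Planning": 3.5,
           "Sourcing and Partnering": 3.0, "Technology and Systems": 4.0,
           "Content Management": 2.0, "Channel and Media": 3.0}

total_weight = 0.20 * len(foundation) + 0.05 * len(enabler)
assert abs(total_weight - 1.0) < 1e-9  # the example weights sum to 100%

index = (sum(s * 0.20 for s in foundation.values())
         + sum(s * 0.05 for s in enabler.values()))
print(round(index, 3))  # 2.4 + 1.25 = 3.65
```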
Of course, a scoring module may use any scoring rubric consistent with the inventive principles described herein. Examples may include an insurer logic rubric (as wherein a score may be based on consistency with an accepted logic), a question type rubric (as wherein a score may be based on consistency with accepted questions), a proficiency evidence rubric (as wherein a score may be based on evidence of proficiency), a summing rubric (as wherein a score may be based on summing component score elements), an averaging rubric (as wherein a score may be based on averaging component score elements), and the like, where the foregoing examples again should not be construed to limit the inventive principles disclosed herein.
The framework (701) in various embodiments also may include an output display (715) for displaying the catastrophe operational risk readiness index. The display should be understood in general terms, so as to encompass anything that can communicate the catastrophe operational risk readiness index, such as monitors, speakers, printers, and the like. Moreover, in various embodiments the catastrophe operational risk readiness index may be a standardized catastrophe risk readiness index, as the principles of standardization may have been discussed elsewhere herein.
Now with attention primarily to Fig. 8, embodiments of the inventive technology may involve a computer architecture (801) for insurer response validity assessment in the determination of a catastrophe operational risk readiness index.
A risk management database (802) may store one or more substantive risk readiness questions (814) and one or more evidentiary validity assessment questions (815).
The substantive risk readiness questions (814) may test, reflect, or otherwise relate to the readiness of an insurer to address risk, such as the readiness of the insurer to respond to catastrophic events, and may be understood to be substantive in nature, as opposed to the evidentiary nature of the evidentiary validity assessment questions (815). In some embodiments, for example, the substantive risk readiness questions (814) may include an approach question, a deployment question, a review question, and a linkage question. These may reach substantive aspects of insurer practices, such as approach (e.g., does the insurer have an approach to handling catastrophic risk, and is that approach written), deployment (e.g., has the insurer's approach been communicated to those elements that handle catastrophic risk), review (e.g., does the insurer have a review process for reviewing other risk events that may bear on the approach), and linkage (e.g., has the approach been linked to all elements of the insurer, even those not directly responsible for handling catastrophic risk). Of course, this example is merely illustrative and should not be construed to limit the inventive principles disclosed herein as to the nature and function of substantive risk readiness questions (814). The evidentiary validity assessment questions (815) may establish, define, or otherwise create a standard against which the validity of responses of an insurer to the substantive risk readiness questions (814) may be evaluated. The evidentiary validity assessment questions (815) therefore may be correlated to the substantive risk readiness questions (814), and may evaluate the validity of insurer responses to the substantive risk readiness questions (814). Such correlation may be made in any manner suitable to effect the evidentiary and/or validation functions that the evidentiary validity assessment questions play with respect to the substantive questions to which they are correlated.
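The correlation between the two question families might be represented as simply as the sketch below; the question wording is hypothetical and illustrative only.

```python
# Sketch of substantive questions (814) paired with correlated evidentiary
# validity assessment questions (815). Wording is hypothetical.

question_pairs = [
    {"substantive": "Is there a written approach to handling catastrophic risk?",
     "evidentiary": "Cite the governing document and its last revision date."},
    {"substantive": "Has the approach been communicated to the teams that handle catastrophic risk?",
     "evidentiary": "Identify the communications or training records that evidence deployment."},
]

for pair in question_pairs:
    print(pair["substantive"], "->", pair["evidentiary"])
```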
Naturally, the computer architecture may have an insurer question interface (808) and an insurer response interface (809) for posing substantive risk readiness questions (814) and evidentiary validity assessment questions (815) and receiving substantive risk readiness responses (816) and evidentiary validity assessment responses (817) from an insurer.
Insurer substantive risk readiness responses (816) in various embodiments may be evaluated by an evaluative substantive risk readiness response scoring module (818) responsive to an insurer response interface (809). This module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) for evaluating the insurer substantive risk readiness responses (816) in order to assign a substantive risk readiness question score. The substantive risk readiness score may reflect a measure of insurer risk readiness.
Evidentiary validity assessment responses (817) in various embodiments may be evaluated by an evaluative evidentiary validity assessment response scoring module (819) responsive to an insurer response interface (809). This module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) for evaluating the insurer evidentiary validity assessment responses (817) in order to assign an evidentiary validity assessment score. The evidentiary validity assessment score may reflect a measure of insurer response validity. Validity, in turn, may reach properties such as veracity, accuracy, reliability, and the like, perhaps providing a measure of soundness of the substantive risk readiness responses (816) and robustness to the catastrophe operational risk readiness index. One or more substantive risk readiness question scores and one or more evidentiary validity assessment scores may be utilized to determine an insurer response validity assessment, such as where a validity assessment module (820) may utilize such substantive risk readiness question scores and such evidentiary validity assessment scores. In some embodiments, the insurer response validity assessment may be either a validation of one or more insurer responses or a disqualification of one or more insurer responses. In the context of determining a catastrophe operational risk readiness index, validation may involve the acceptance and subsequent use of the response in determining the index. Disqualification may involve not using the disqualified insurer response in determining the index, such as perhaps by omitting the response from the determination, by requiring a new response from the insurer to replace the disqualified response, or the like.
Once more, this module may use any suitable methodology consistent with the objectives of the architecture (801) (for example, as such objectives may be programmed in a computer architecture (801)) to assess insurer response validity. As one example, a validity assessment module (820) in some embodiments may be a threshold analysis validity assessment module, as wherein the assessment of insurer responses as being either valid or disqualified may depend on whether various scores fall within or outside of a threshold. Of course, this example is merely illustrative and should not be construed to limit the inventive principles disclosed herein as to the nature and function of the validity assessment module (820). As noted, an evaluative substantive risk readiness response scoring module (818), an evaluative evidentiary validity assessment response scoring module (819), and a validity assessment module (820) may each use any suitable methodologies for generating output consistent with the inventive principles described herein. As one example, embodiments may involve an evaluative substantive risk readiness response scoring module (818) utilizing a scoring rubric that equally weighs substantive insurer responses and takes a simple average of the responses to assign a substantive risk readiness score. Similarly, an evaluative evidentiary validity assessment response scoring module (819) may utilize a scoring rubric that equally weighs evidentiary insurer responses and takes a simple average of the responses to assign an evidentiary validity assessment score. A validity assessment module (820) then may utilize a threshold analysis where: 1) the substantive risk readiness score is utilized for the validity assessment if the evidentiary validity assessment score is equal to or differs by no more than 10 percent from the substantive risk readiness score; 2) the evidentiary validity assessment score is utilized for the validity assessment if the evidentiary validity assessment score differs from the substantive risk readiness score by an amount greater than 10 percent but no more than 20 percent of the substantive risk readiness score; and 3) the insurer responses are disqualified if the evidentiary validity assessment score differs from the substantive risk readiness score by more than 20 percent of the substantive risk readiness score.
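That example threshold analysis can be expressed compactly, as in the sketch below; the 10 and 20 percent bands come from the example above, while the function name and return convention are assumptions.

```python
# Sketch of the example threshold analysis: compare the evidentiary
# validity assessment score to the substantive risk readiness score and
# validate or disqualify per the 10%/20% bands described above.

def assess_validity(substantive: float, evidentiary: float):
    """Return ('validated', score) or ('disqualified', None)."""
    gap = abs(evidentiary - substantive) / substantive
    if gap <= 0.10:
        return "validated", substantive   # within 10%: use substantive score
    if gap <= 0.20:
        return "validated", evidentiary   # 10-20% apart: use evidentiary score
    return "disqualified", None           # beyond 20%: disqualify responses

print(assess_validity(4.0, 4.2))  # 5% gap  -> ('validated', 4.0)
print(assess_validity(4.0, 3.4))  # 15% gap -> ('validated', 3.4)
print(assess_validity(4.0, 2.8))  # 30% gap -> ('disqualified', None)
```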
The architecture (801) in various embodiments also may include an output display (812) for displaying the insurer response validity assessment. The display should be understood in general terms, so as to encompass anything that can communicate the insurer response validity assessment, such as monitors, speakers, printers, and the like.
As can be easily understood from the foregoing, the basic concepts of the present inventive technology may be embodied in a variety of ways. It involves both risk readiness indexing techniques as well as devices to accomplish the appropriate risk readiness indexing. In this application, the risk readiness indexing techniques are disclosed as part of the results shown to be achieved by the various devices described and as steps which are inherent to utilization. They are simply the natural result of utilizing the devices as intended and described. In addition, while some devices are disclosed, it should be understood that these not only accomplish certain methods but also can be varied in a number of ways. Importantly, as to all of the foregoing, all of these facets should be understood to be encompassed by this disclosure.
The discussion included in this patent application is intended to serve as a basic description. The reader should be aware that the specific discussion may not explicitly describe all embodiments possible; many alternatives are implicit. It also may not fully explain the generic nature of the inventive technology and may not explicitly show how each feature or element can actually be representative of a broader function or of a great variety of alternative or equivalent elements. Again, these are implicitly included in this disclosure. Where the inventive technology is described in device-oriented terminology, each element of the device implicitly performs a function. Apparatus claims may not only be included for the device described, but also method or process claims may be included to address the functions the inventive technology and each element performs. Neither the description nor the terminology is intended to limit the scope of the claims that will be included in any subsequent patent application.
It should also be understood that a variety of changes may be made without departing from the essence of the inventive technology. Such changes are also implicitly included in the description. They still fall within the scope of this inventive technology. A broad disclosure encompassing the explicit embodiment(s) shown, the great variety of implicit alternative embodiments, and the broad methods or processes and the like are encompassed by this disclosure and may be relied upon when drafting the claims for any subsequent patent application. It should be understood that such language changes and broader or more detailed claiming may be accomplished at a later date (such as by any required deadline) or in the event the applicant subsequently seeks a patent filing based on this filing. With this understanding, the reader should be aware that this disclosure is to be understood to support any subsequently filed patent application that may seek examination of as broad a base of claims as deemed within the applicant's right and may be designed to yield a patent covering numerous aspects of the inventive technology both independently and as an overall system. Further, each of the various elements of the inventive technology and claims may also be achieved in a variety of manners. Additionally, when used or implied, an element is to be understood as encompassing individual as well as plural structures that may or may not be physically connected. This disclosure should be understood to encompass each such variation, be it a variation of an embodiment of any apparatus embodiment, a method or process embodiment, or even merely a variation of any element of these. Particularly, it should be understood that as the disclosure relates to elements of the inventive technology, the words for each element may be expressed by equivalent apparatus terms or method terms— even if only the function or result is the same. Such equivalent, broader, or even more generic terms should be considered to be encompassed in the description of each element or action. Such terms can be substituted where desired to make explicit the implicitly broad coverage to which this inventive technology is entitled. As but one example, it should be understood that all actions may be expressed as a means for taking that action or as an element which causes that action. Similarly, each physical element disclosed should be understood to encompass a disclosure of the action which that physical element facilitates. Regarding this last aspect, as but one example, the disclosure of an "indexer" should be understood to encompass disclosure of the act of "indexing"— whether explicitly discussed or not— and, conversely, were there effectively disclosure of the act of "indexing", such a disclosure should be understood to encompass disclosure of an "indexer" and even a "means for indexing." Such changes and alternative terms are to be understood to be explicitly included in the description. Further, each such means (whether explicitly so described or not) should be understood as encompassing all elements that can perform the given function, and all descriptions of elements that perform a described function should be understood as a non- limiting example of means for performing that function.
Any patents, publications, or other references mentioned in this application for patent are hereby incorporated by reference. Any priority case(s) claimed by this application is hereby appended and hereby incorporated by reference. In addition, as to each term used it should be understood that unless its utilization in this application is inconsistent with a broadly supporting interpretation, common dictionary definitions should be understood as incorporated for each term and all definitions, alternative terms, and synonyms such as contained in the Random House Webster's Unabridged Dictionary, second edition are hereby incorporated by reference. Finally, all references listed below and all references listed in the list of References To Be Incorporated By Reference In Accordance With The Provisional Patent Application or other information statement filed with the application are hereby appended and hereby incorporated by reference, however, as to each of the above, to the extent that such information or statements incorporated by reference might be considered inconsistent with the patenting of this/these inventive technology(s) such statements are expressly not to be considered as made by the applicant(s).
I. U.S. PATENT DOCUMENTS
Patent Number Kind Code Issue Date Name of Patentee or Applicant of cited Document
7571171 B1 2009-08-04 Shaw
7184983 B2 2007-02-27 Corby, et al.
7627491 B2 2009-12-01 Feyen, et al.
7558757 B2 2009-07-07 Conroy, et al.
7747518 B2 2010-06-29 Caballero, et al.
7644028 B2 2010-01-05 Waddel, et al.
7725375 B2 2010-05-25 Shepherd
7822676 B2 2010-10-26 Shepherd
7711634 B2 2010-05-04 Klugman
7840423 B2 2010-11-23 Sato
8050996 B2 2011-11-01 Shepherd
8229769 B1 2012-07-24 Hopkins, III
7979336 B2 2011-07-12 Weber, et al.
8229768 B1 2012-07-24 Hopkins, III
8280804 B2 2012-10-02 Iyer, et al.
8301560 B2 2012-10-30 Winslow, et al.
7430534 B2 2008-09-30 Lef, et al.
7627491 B2 2009-12-01 Feyen, et al.
7567914 B2 2009-07-28 Bonissone, et al.
6418417 B1 2002-07-09 Corby, et al.
8595036 B2 2013-11-26 Jones, et al.
8468037 B1 2013-06-18 Clarke, et al.
8090600 B2 2012-01-03 Zlade, et al.
8666786 B1 2014-03-04 Wirz, et al.
7707050 B2 2010-04-27 Chen, et al.
8548833 B2 2013-10-01 Jones, et al.
7881951 B2 2011-02-01 Roschelle, et al.
8554588 B2 2013-10-08 Jones, et al.
7899688 B2 2011-03-01 Bonissone, et al.
7783505 B2 2010-08-24 Roschelle, et al.
8515783 B1 2013-02-20 Weeks
5873066 1999-02-16 Underwood, et al.

Thus, the applicant(s) should be understood to have support to claim and make a statement of invention to at least: i) each of the risk readiness indexing devices as herein disclosed and described, ii) the related methods disclosed and described, iii) similar, equivalent, and even implicit variations of each of these devices and methods, iv) those alternative designs which accomplish each of the functions shown as are disclosed and described, v) those alternative designs and methods which accomplish each of the functions shown as are implicit to accomplish that which is disclosed and described, vi) each feature, component, and step shown as separate and independent inventions, vii) the applications enhanced by the various systems or components disclosed, viii) the resulting products produced by such systems or components, ix) each system, method, and element shown or described as now applied to any specific field or devices mentioned, x) methods and apparatuses substantially as described hereinbefore and with reference to any of the accompanying examples, xi) an apparatus for performing the methods described herein comprising means for performing the steps, xii) the various combinations and permutations of each of the elements disclosed, xiii) each potentially dependent claim or concept as a dependency on each and every one of the independent claims or concepts presented, and xiv) all inventions described herein.
In addition and as to computer aspects and each aspect amenable to programming or other electronic automation, the applicant(s) should be understood to have support to claim and make a statement of invention to at least: xv) processes performed with the aid of or on a computer as described throughout the above discussion, xvi) a programmable apparatus as described throughout the above discussion, xvii) a computer readable memory encoded with data to direct a computer comprising means or elements which function as described throughout the above discussion, xviii) a computer configured as herein disclosed and described, xix) individual or combined subroutines and programs as herein disclosed and described, xx) a carrier medium carrying computer readable code for control of a computer to carry out separately each and every individual and combined method described herein or in any claim, xxi) a computer program to perform separately each and every individual and combined method disclosed, xxii) a computer program containing all and each combination of means for performing each and every individual and combined step disclosed, xxiii) a storage medium storing each computer program disclosed, xxiv) a signal carrying a computer program disclosed, xxv) the related methods disclosed and described, xxvi) similar, equivalent, and even implicit variations of each of these systems and methods, xxvii) those alternative designs which accomplish each of the functions shown as are disclosed and described, xxviii) those alternative designs and methods which accomplish each of the functions shown as are implicit to accomplish that which is disclosed and described, xxix) each feature, component, and step shown as separate and independent inventions, and xxx) the various combinations and permutations of each of the above.
With regard to claims whether now or later presented for examination, it should be understood that for practical reasons and so as to avoid great expansion of the examination burden, the applicant may at any time present only initial claims or perhaps only initial claims with only initial dependencies. The office and any third persons interested in potential scope of this or subsequent applications should understand that broader claims may be presented at a later date in this case, in a case claiming the benefit of this case, or in any continuation in spite of any preliminary amendments, other amendments, claim language, or arguments presented, thus throughout the pendency of any case there is no intention to disclaim or surrender any potential subject matter. It should be understood that if or when broader claims are presented, such may require that any relevant prior art that may have been considered at any prior time may need to be re-visited since it is possible that to the extent any amendments, claim language, or arguments presented in this or any subsequent application are considered as made to avoid such prior art, such reasons may be eliminated by later presented claims or the like. Both the examiner and any person otherwise interested in existing or later potential coverage, or considering if there has at any time been any possibility of an indication of disclaimer or surrender of potential coverage, should be aware that no such surrender or disclaimer is ever intended or ever exists in this or any subsequent application. Limitations such as arose in Hakim v. Cannon Avent Group, PLC, 479 F.3d 1313 (Fed. Cir 2007), or the like are expressly not intended in this or any subsequent related matter. In addition, support should be understood to exist to the degree required under new matter laws— including but not limited to European Patent Convention Article 123(2) and United States Patent Law 35 USC 132 or other such laws— to permit the addition of any of the various dependencies or other elements presented under one independent claim or concept as dependencies or elements under any other independent claim or concept. In drafting any claims at any time whether in this application or in any subsequent application, it should also be understood that the applicant has intended to capture as full and broad a scope of coverage as legally available. To the extent that insubstantial substitutes are made, to the extent that the applicant did not in fact draft any claim so as to literally encompass any particular embodiment, and to the extent otherwise applicable, the applicant should not be understood to have in any way intended to or actually relinquished such coverage as the applicant simply may not have been able to anticipate all eventualities; one skilled in the art, should not be reasonably expected to have drafted a claim that would have literally encompassed such alternative embodiments.
Further, if or when used, the use of the transitional phrase "comprising" is used to maintain the "open-end" claims herein, according to traditional claim interpretation. Thus, unless the context requires otherwise, it should be understood that the term "comprise" or variations such as "comprises" or "comprising", are intended to imply the inclusion of a stated element or step or group of elements or steps but not the exclusion of any other element or step or group of elements or steps. Such terms should be interpreted in their most expansive form so as to afford the applicant the broadest coverage legally permissible. The use of the phrase, "or any other claim" is used to provide support for any claim to be dependent on any other claim, such as another dependent claim, another independent claim, a previously listed claim, a subsequently listed claim, and the like. As one clarifying example, if a claim were dependent "on claim 20 or any other claim" or the like, it could be re-drafted as dependent on claim 1, claim 15, or even claim 25 (if such were to exist) if desired and still fall with the disclosure. It should be understood that this phrase also provides support for any combination of elements in the claims and even incorporates any desired proper antecedent basis for certain claim combinations such as with combinations of method, apparatus, process, and the like claims.
Finally, any claims set forth at any time are hereby incorporated by reference as part of this description of the inventive technology, and the applicant expressly reserves the right to use all of or a portion of such incorporated content of such claims as additional description to support any of or all of the claims or any element or component thereof, and the applicant further expressly reserves the right to move any portion of or all of the incorporated content of such claims or any element or component thereof from the description into the claims or vice-versa as necessary to define the matter for which protection is sought by this application or by any subsequent continuation, division, or continuation-in-part application thereof, or to obtain any benefit of, reduction in fees pursuant to, or to comply with the patent laws, rules, or regulations of any country or treaty, and such content incorporated by reference shall survive during the entire pendency of this application including any subsequent continuation, division, continuation-in-part application thereof or any reissue or extension thereon.

Claims

What is claimed is:
1. A computerized framework for generating a catastrophe operational risk readiness index comprising:
• a risk management database for storing a plurality of insurance preparedness foundation criteria, a plurality of insurance preparedness enabler criteria, and an insurer risk readiness question set for each said criterion;
• an insurer question interface responsive to said risk management database for posing each question of each said insurer risk readiness question set to an insurer;
• an insurer response interface for receiving responses to each said posed question from said insurer;
• an evaluative response scoring module responsive to said insurer response interface for assigning an evaluative response score to each said insurer response;
• an evaluative criterion scoring module responsive to said evaluative response scoring module for assigning an evaluative criterion score to each said criterion;
• an evaluative index scoring module responsive to said evaluative criterion scoring module for assigning a catastrophe operational risk readiness index for said insurer;
• an output display for displaying said catastrophe operational risk readiness index.
2. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein said foundational criteria comprise:
• a leadership and culture criterion factor analytic module;
• a customer experience criterion factor analytic module; and
• a people criterion factor analytic module;
and wherein said enabler criteria comprise:
• a measurement and analytics criterion factor analytic module;
• a claims response agility criterion factor analytic module;
• a reputation criterion factor analytic module;
• an insight and planning criterion factor analytic module;
• a sourcing, partnering, and procurement criterion factor analytic module;
• a management information and information technology criterion factor analytic module;
• a content management criterion factor analytic module;
• and a channel and media criterion factor analytic module.
3. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein said foundational criteria comprise at least one foundational criterion selected from the group consisting of:
• a leadership and culture criterion factor analytic module;
• a customer experience criterion factor analytic module; and
• a people criterion factor analytic module;
and wherein said enabler criteria comprise at least one criterion selected from the group consisting of:
• a measurement and analytics criterion factor analytic module;
• a claims response agility criterion factor analytic module;
• a reputation criterion factor analytic module;
• an insight and planning criterion factor analytic module;
• a sourcing, partnering, and procurement criterion factor analytic module;
• a management information and information technology criterion factor analytic module;
• a content management criterion factor analytic module;
• and a channel and media criterion factor analytic module.
4. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein at least one said criterion further comprises at least one sub-criterion.
5. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 4 wherein said at least one sub-criterion comprises a sub-criterion selected from the group consisting of:
• for a leadership and culture criterion factor analytic module:
• a leadership sub-criterion factor analytic module;
• a governance sub-criterion factor analytic module;
• for a customer experience criterion factor analytic module:
• a customer understanding sub-criterion factor analytic module;
• a customer engagement sub-criterion factor analytic module;
• for a people criterion factor analytic module:
• a people engagement sub-criterion factor analytic module;
• a capability and capacity sub-criterion factor analytic module;
• a people environment sub-criterion factor analytic module;
• for a measure and analytics criterion factor analytic module:
• a measurement system sub-criterion factor analytic module;
• an analysis review and improvement sub-criterion factor analytic module;
• for a claims response criterion factor analytic module:
• a claims response strategy and claims management work system sub-criterion factor analytic module;
• a claims process management sub-criterion factor analytic module;
• for a brand and reputation criterion factor analytic module:
• a reputational risk identification and consensus sub-criterion factor analytic module;
• a reputational risk strategy sub-criterion factor analytic module;
• a reputational risk management sub-criterion factor analytic module;
• for an insight and planning criterion factor analytic module:
• an insight sub-criterion factor analytic module;
• a planning sub-criterion factor analytic module;
• for a sourcing and partnering criterion factor analytic module:
• a sourcing sub-criterion factor analytic module;
• a partnering sub-criterion factor analytic module;
• a procurement sub-criterion factor analytic module;
• for a technology and systems criterion factor analytic module:
• a management of information sub-criterion factor analytic module;
• an information technology sub-criterion factor analytic module;
• for a content management criterion factor analytic module:
• a content management and content management system sub-criterion factor analytic module;
• a content management lifecycle process sub-criterion factor analytic module;
• for a channel and media criterion factor analytic module:
• a channel management sub-criterion factor analytic module; and
• a media choice sub-criterion factor analytic module.
6. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 4 wherein at least one said sub-criterion further comprises at least one attribute.
7. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 6 wherein said at least one attribute comprises an attribute selected from the group consisting of:
• for a leadership sub-criterion factor analytic module:
• an objective setting attribute activity integration module;
• a decision execution attribute activity integration module;
• a direction setting attribute activity integration module;
• a direction alignment attribute activity integration module;
• an accountability attribute activity integration module;
• an engagement and performance attribute activity integration module;
• a learning and development attribute activity integration module;
• for a governance sub-criterion factor analytic module:
• an organization governance attribute activity integration module;
• a legal and ethical behavior attribute activity integration module;
• a key communities attribute activity integration module;
• for a customer understanding sub-criterion factor analytic module:
• an access to and use of customer data attribute activity integration module;
• a voice of the customer attribute activity integration module;
• a customer advocacy attribute activity integration module;
• for a customer engagement sub-criterion factor analytic module:
• a customer centricity attribute activity integration module;
• a product service and support attribute activity integration module;
• for a people engagement sub-criterion factor analytic module:
• a people learning and development attribute activity integration module;
• a people satisfaction and engagement attribute activity integration module;
• a people engagement assessment attribute activity integration module;
• for a capability and capacity sub-criterion factor analytic module:
• a capability attribute activity integration module;
• a capacity attribute activity integration module;
• for a people environment sub-criterion factor analytic module:
• a physical environment attribute activity integration module;
• a policy environment attribute activity integration module;
• for a measurement system sub-criterion factor analytic module:
• a performance management attribute activity integration module;
• for an analysis review and improvement sub-criterion factor analytic module:
• an analysis and review attribute activity integration module;
• an improvement attribute activity integration module;
• for a claims response strategy and claims management work system sub-criterion factor analytic module:
• a claims management work system design attribute activity integration module;
• a key claims work processes attribute activity integration module;
• for a claims process management sub-criterion factor analytic module:
• a claims process design attribute activity integration module;
• a claims process maintenance and management attribute activity integration module;
• a voice of claims process attribute activity integration module;
• a claims process improvement attribute activity integration module;
• for a reputational risk identification and consensus sub-criterion factor analytic module:
• a reputational risk identification attribute activity integration module;
• a reputational risk consensus attribute activity integration module;
• a reputational risk stakeholder involvement attribute activity integration module;
• for a reputational risk strategy sub-criterion factor analytic module:
• a reputational risk strategy development process attribute activity integration module;
• a reputation strategic objectives attribute activity integration module;
• for a reputational risk management sub-criterion factor analytic module:
• a reputational risk management framework attribute activity integration module;
• a reputation management governance attribute activity integration module;
• for an insight sub-criterion factor analytic module:
• an insight adoption and categorization attribute activity integration module;
• an insight utilization process attribute activity integration module;
• for a planning sub-criterion factor analytic module:
• a planning process attribute activity integration module;
• a statement of CORR objectives attribute activity integration module;
• a deployment and alignment attribute activity integration module;
• a measurement attribute activity integration module;
• for a sourcing sub-criterion factor analytic module:
• a strategic choice attribute activity integration module;
• a determining factors attribute activity integration module;
• an update and alignment attribute activity integration module;
• for a partnering sub-criterion factor analytic module:
• a strategic allocation attribute activity integration module;
• an allocation factors attribute activity integration module;
• an update and alignment attribute activity integration module;
• for a procurement sub-criterion factor analytic module:
• a procurement process attribute activity integration module;
• a process value and risk profile attribute activity integration module;
• an update and alignment attribute activity integration module;
• for a management of information sub-criterion factor analytic module:
• a management of information attribute activity integration module;
• a knowledge management attribute activity integration module;
• for an information technology sub-criterion factor analytic module:
• a management of information technology attribute activity integration module;
• an information technology currency attribute activity integration module;
• for a content management and content management system sub-criterion factor analytic module:
• a content definition attribute activity integration module;
• a content access and usability attribute activity integration module;
• a content availability attribute activity integration module;
• an open and distributable attribute activity integration module;
• a customizable attribute activity integration module;
• a measurable attribute activity integration module;
• a security attribute activity integration module;
• a technology attribute activity integration module;
• for a content management lifecycle process sub-criterion factor analytic module:
• a process attribute activity integration module;
• a content retrieval attribute activity integration module;
• a content categorization attribute activity integration module;
• a content distribution attribute activity integration module;
• for a channel management sub-criterion factor analytic module:
• a channel management philosophy and approach attribute activity integration module;
• a channel management goals and objectives attribute activity integration module;
• a channel management process attribute activity integration module;
• for a media choice sub-criterion factor analytic module:
• a media management philosophy and strategy attribute activity integration module;
• a media management process attribute activity integration module;
• a media choice and categorization attribute activity integration module.
8. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 6 wherein each said criterion, sub-criterion, and attribute corresponds to at least one characteristic of said insurer.
9. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein each said question comprises a question that tests for at least one of said criteria, sub-criteria, or attributes.
10. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 9 wherein said criteria, sub-criteria, and attributes comprise criteria, sub-criteria, and attributes that are customizable to individual characteristics of said insurer.
11. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein each said scoring module comprises a computerized weighted scoring rubric module.
12. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 11 wherein said computerized weighted scoring rubric module comprises a computerized weighted scoring rubric module utilizing a differential weight for one of at least one of said criteria, at least one sub-criteria, and at least one attribute.
13. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein said evaluative index scoring module comprises an evaluative index scoring module utilizing a scoring rubric that weights each said foundation criterion at 20 percent and each said enabler criterion at 5 percent.
14. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein each said scoring module comprises a scoring module utilizing at least one scoring rubric selected from the group consisting of an insurer logic rubric, a question type rubric, a proficiency evidence rubric, a summing rubric, and an averaging rubric.
15. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein each said insurer risk readiness question set comprises a set of automatic adaptive question modules.
16. A computerized framework for generating a catastrophe operational risk readiness index as described in claim 1 wherein said catastrophe risk readiness index comprises a standardized catastrophe risk readiness index.
17. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index comprising:
• a risk management database for storing a plurality of substantive risk readiness questions and a plurality of evidentiary validity assessment questions correlated to said substantive risk readiness questions;
• an insurer question interface responsive to said risk management database for posing each said substantive risk readiness question and each said evidentiary validity assessment question to said insurer;
• an insurer response interface for receiving substantive risk readiness responses and evidentiary validity assessment responses from said insurer;
• an evaluative substantive risk readiness response scoring module responsive to said insurer response interface for assigning a substantive risk readiness question score to each said insurer substantive risk readiness response;
• an evaluative evidentiary validity assessment response scoring module responsive to said insurer response interface for assigning an evidentiary validity assessment score to each said insurer evidentiary validity assessment response;
• a validity assessment module for utilizing said substantive risk readiness question score and said evidentiary validity assessment score to assess insurer response validity;
• an output display for displaying said insurer response validity assessment.
18. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 17 wherein said plurality of evidentiary validity assessment questions correlated to said substantive risk readiness questions comprise evidentiary validity assessment questions that evaluate the validity of insurer responses to said substantive risk readiness questions.
19. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 18 wherein said substantive risk readiness question score comprises a measure of insurer risk readiness.
20. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 19 wherein said evidentiary validity assessment score comprises a measure of insurer response validity.
21. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 20 wherein said insurer response validity assessment comprises an assessment selected from the group consisting of a validation of said insurer responses and a disqualification of said insurer responses.
22. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 17 wherein said substantive risk readiness questions comprise an approach question, a deployment question, a review question, and a linkage question.
23. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 22 wherein said evaluative substantive risk readiness response scoring module utilizes a scoring rubric that equally weights said insurer responses and takes a simple average of said insurer responses to assign said substantive risk readiness question score.
24. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 17 wherein said evidentiary validity assessment questions comprise at least two evidentiary validity assessment questions.
25. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 24 wherein said evaluative evidentiary validity assessment response scoring module utilizes a scoring rubric that equally weights said insurer responses and takes a simple average of said insurer responses to assign said evidentiary validity assessment score.
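By way of non-limiting illustration, claims 23 and 25 both recite the same equal-weight, simple-average rubric. A minimal Python sketch, assuming numeric response scores on a common 0-100 scale (an assumption; the claims do not fix a scale):

```python
def simple_average(scores):
    """Equal-weight rubric: every response counts the same and the
    module score is the arithmetic mean of the response scores."""
    return sum(scores) / len(scores)

# Claim-23 style: one score each for the approach, deployment, review,
# and linkage questions (values are illustrative).
substantive_score = simple_average([80, 60, 70, 90])  # -> 75.0
# Claim-25 style: at least two evidentiary validity assessment responses.
evidentiary_score = simple_average([70, 80])          # -> 75.0
```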
26. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 17 wherein said validity assessment module comprises a threshold analysis validity assessment module.
27. A computer architecture for insurer response validity assessment in the determination of a catastrophe operational risk readiness index as described in claim 26 wherein said threshold analysis validity assessment module utilizes a threshold analysis consisting of:
• utilizing the substantive risk readiness score for the validity assessment if the evidentiary validity assessment score equals the substantive risk readiness score or differs from it by no more than 10 percent of the substantive risk readiness score;
• utilizing the evidentiary validity assessment score for the validity assessment if the evidentiary validity assessment score differs from the substantive risk readiness score by an amount greater than 10 percent but no more than 20 percent of the substantive risk readiness score;
• disqualifying said insurer responses if the evidentiary validity assessment score differs from the substantive risk readiness score by more than 20 percent of the substantive risk readiness score.
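By way of non-limiting illustration, the three bands of claim 27 reduce to comparing the relative deviation between the two scores against 10 and 20 percent cut-offs. A minimal Python sketch, assuming a positive substantive score and numeric scores on a common scale (the claim fixes only the percentage bands; all names are illustrative):

```python
def assess_validity(substantive, evidentiary):
    """Return the score the validity assessment should use, or None
    when the insurer responses are disqualified (claim-27 bands)."""
    deviation = abs(evidentiary - substantive) / substantive
    if deviation <= 0.10:
        return substantive   # responses validated; keep substantive score
    if deviation <= 0.20:
        return evidentiary   # fall back to the evidentiary score
    return None              # deviation over 20 percent: disqualify

print(assess_validity(80, 75))  # 6.25% deviation -> 80
print(assess_validity(80, 68))  # 15% deviation   -> 68
print(assess_validity(80, 60))  # 25% deviation   -> None
```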
28. A method for evaluating preparedness of an organization for a catastrophic event, comprising:
• providing a user with a catastrophe operational risk readiness (CORR) index assessment question set, the question set including a plurality of questions related to one or more criteria and/or attributes of the organization;
• receiving responses from the user to the question set;
• compiling and scoring the responses; and
• determining component and overall scores for CORR based on the response scores and predetermined weighting associated with the one or more criteria and/or attributes.
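By way of non-limiting illustration, the method of claim 28 reads end to end as a score-then-weight pipeline. A minimal Python sketch, assuming responses grouped by criterion, component scores taken as simple averages, and predetermined weights that sum to 1.0 (all assumptions beyond the claim text; names are illustrative):

```python
def score_corr(responses, weights):
    """responses: {criterion: [response scores]}; weights: {criterion: weight}.
    Returns the per-criterion component scores and the overall CORR index."""
    components = {c: sum(s) / len(s) for c, s in responses.items()}
    overall = sum(weights[c] * components[c] for c in components)
    return components, overall

components, overall = score_corr(
    {"foundation": [80, 70], "enabler": [60, 90]},
    {"foundation": 0.8, "enabler": 0.2},
)
print(components)  # {'foundation': 75.0, 'enabler': 75.0}
print(overall)     # 75.0
```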
29. The method of claim 28, wherein questions of the question set are provided according to a prescribed order and methodology.
30. The method of claim 28, wherein compiling and scoring responses comprises verifying the information contained in the responses and scoring the responses according to a scoring rubric.
31. The method of claim 28, wherein the question set comprises questions organized in a framework for evaluation of the organization.
32. The method of claim 31, wherein the framework comprises main criteria, sub-criteria, and specific attributes for each sub-criterion.
33. The method of claim 32, wherein the question set comprises a list of detailed questions related to the current performance of the organization against the framework.
34. The method of claim 32, wherein a first set of attributes is weighted more heavily than a second set of attributes.
35. The method of claim 34, wherein the first set of attributes corresponds to documented processes associated with an operational risk associated with a catastrophic event.
36. The method of claim 34, wherein the second set of attributes corresponds to a deployment of a process associated with a response to a catastrophic event.
37. The method of claim 34, wherein the first set of attributes corresponds to technology and systems required for operation of the organization, and the second set of attributes corresponds to the brand and reputation of the organization.
38. The method of claim 28, further comprising developing an improvement plan for improvement of the organization's catastrophe operational risk readiness based on the component and overall scores.
PCT/US2014/033079 2013-04-25 2014-04-04 Computerized indexing of catastrophic operational risk readiness WO2014176018A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361816025P 2013-04-25 2013-04-25
US61/816,025 2013-04-25

Publications (1)

Publication Number Publication Date
WO2014176018A1 true WO2014176018A1 (en) 2014-10-30

Family

ID=51792297

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/033079 WO2014176018A1 (en) 2013-04-25 2014-04-04 Computerized indexing of catastrophic operational risk readiness

Country Status (1)

Country Link
WO (1) WO2014176018A1 (en)

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5909669A (en) * 1996-04-01 1999-06-01 Electronic Data Systems Corporation System and method for generating a knowledge worker productivity assessment
US20060009992A1 (en) * 2004-07-02 2006-01-12 Cwiek Mark A Method and system for assessing a community's preparedness, deterrence, and response capability for handling crisis situations
US20090076894A1 (en) * 2007-09-13 2009-03-19 Cary Lee Bates Advertising in Virtual Environments Based on Crowd Statistics
US20090326989A1 (en) * 2005-03-23 2009-12-31 Schmitt Brett A Interactive information management system and method
US7752054B1 (en) * 2000-05-04 2010-07-06 Microsoft Corporation Advisor referral tool
US20100211413A1 (en) * 2009-02-18 2010-08-19 Emergis Inc. Revising containerized processing logic for use in insurance claim processing
US7809595B2 (en) * 2002-09-17 2010-10-05 Jpmorgan Chase Bank, Na System and method for managing risks associated with outside service providers
US7818152B2 (en) * 2001-07-20 2010-10-19 International Business Machines Corporation Computerized method and system for maturity assessment of business processes
US7844475B1 (en) * 2001-02-06 2010-11-30 Makar Enterprises, Inc. Method for strategic commodity management through mass customization
US20110166900A1 (en) * 2010-01-04 2011-07-07 Bank Of America Corporation Testing and Evaluating the Recoverability of a Process
US20110257961A1 (en) * 2010-04-14 2011-10-20 Marc Tinkler System and method for generating questions and multiple choice answers to adaptively aid in word comprehension
US8175899B2 (en) * 2002-09-20 2012-05-08 Fortis, Inc. Systems and methods for processing a request to repair or replace an item covered by insurance
US8239677B2 (en) * 2006-10-10 2012-08-07 Equifax Inc. Verification and authentication systems and methods
US20120260201A1 (en) * 2011-04-07 2012-10-11 Infosys Technologies Ltd. Collection and analysis of service, product and enterprise soft data
US8326654B2 (en) * 2007-04-23 2012-12-04 Hewlett-Packard Development Company, L.P. Providing a service to a service requester
US8396719B2 (en) * 2005-03-25 2013-03-12 J2 Global Communications Real-time customer service assistance using collected customer life cycle data
US8401878B2 (en) * 2009-01-06 2013-03-19 Mark Stender Method and system for connecting an insured to an insurer using a mobile device
US8401896B2 (en) * 2005-11-01 2013-03-19 Accenture Global Services Limited Automated task processor for insurance claims

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 14788238; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 14788238; Country of ref document: EP; Kind code of ref document: A1