Professional standards and the Daubert standard

Standards: Proof and verification

Modeling doesn’t have to be limited to quantitative areas; models can be qualitative. A good subject to illustrate this is the concept of a “professional standard”. This is a widely discussed topic, but I hope to approach it through a relevant class of theoretical models, namely proof systems, and use concepts from them to discuss professional standards and a particular legal standard potentially applicable to your professional work.


NYC Wind Speed – Fitting Distributions


Risk Modeling – NYC Wind

I just posted my first Mathematica model today. It demonstrates modeling NYC wind speeds. Look for it under the new Model entry in the main menu. It is stored as both a notebook (.nb) and as a computable document (.cdf). To use the CDF, you will need to install the Wolfram Mathematica CDF Player.

I am currently using Mathematica 11.2, and the notebook and CDF are saved in Dropbox.


Windy Day
It is a little windy today, isn’t it?

Description

The notebook reads in the maximum wind speeds for NYC using the WeatherData Mathematica function. Those values are converted from km/hour to mph. From that converted data, Mathematica then fits several different statistical distributions and displays each fit. I chose these distributions for their various properties, such as positive support or infinite support (for example, the normal or log-normal). I also included the simplest distribution used within ERM, the triangular distribution, and fit the extreme value distribution commonly used for modeling extreme winds. However, I find that these distributions don’t seem to produce wind speeds in excess of 100 mph, which is the certified wind-speed protection required of NYC skyscrapers.
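The analysis itself lives in the Mathematica notebook, but the same fitting workflow can be sketched in Python with SciPy. The data below is synthetic (a stand-in for the WeatherData pull, not actual NYC readings), and the candidate list mirrors the kinds of distributions discussed above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic daily maximum wind speeds in km/h (stand-in for WeatherData pulls)
speeds_kmh = rng.weibull(2.0, size=1000) * 30.0
speeds_mph = speeds_kmh * 0.621371  # convert km/h to mph

# Candidate distributions with positive or infinite support
candidates = {
    "normal": stats.norm,
    "log-normal": stats.lognorm,
    "Weibull": stats.weibull_min,
    "extreme value (Gumbel)": stats.gumbel_r,
}
for name, dist in candidates.items():
    params = dist.fit(speeds_mph)                     # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(speeds_mph, *params)) # compare fit quality
    print(f"{name}: log-likelihood = {loglik:.1f}")
```

Comparing log-likelihoods is one simple way to rank the fits; the notebook itself uses Mathematica's built-in distribution fitting instead.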

I also use the Mathematica function FindDistribution to find the ten distributions that best fit the data. Here we look at the maximum, mean, and 98% quantile of each of these ten distributions and examine an Economic Capital metric. Even though economic capital doesn’t strictly make sense for a wind speed model, it is a means to determine how far above the average wind speed a 1-in-50-year event would be. It is measured as the 98th percentile of a distribution less its mean. Since 98% = 1 − 1/50, the 98th percentile tells you the wind speed of a 1-in-50-year event. The excess of this over the mean is the additional wind speed, above the mean, that you would need to address if you wanted to cover a 1-in-50-year event.
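As a minimal sketch of this economic-capital-style metric, here is the 98th-percentile-minus-mean calculation in Python, using an assumed Weibull wind-speed distribution (the parameters are illustrative, not fitted to the NYC data):

```python
from scipy import stats

# Assumed fitted wind-speed distribution, speeds in mph (illustrative parameters)
dist = stats.weibull_min(c=2.0, scale=20.0)

mean_speed = dist.mean()
q98 = dist.ppf(0.98)   # 98% = 1 - 1/50, so this is the 1-in-50-year speed
ec = q98 - mean_speed  # excess over the mean you would need to cover

print(f"mean = {mean_speed:.1f} mph, 98th pct = {q98:.1f} mph, EC = {ec:.1f} mph")
```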

Wind Risk Links

Below are several useful wind risk links:

Windstorms and Tornadoes Hazard Analysis for New York City

SEVERE WEATHER: THUNDERSTORMS, TORNADOES, AND WINDSTORMS for NYC

NYC – Coastal Storms Risk Assessment

Sandy spared New York’s skyscrapers, but high-rises carry high risk during hurricanes

List of NY Hurricanes

The 1893 NY Hurricane and the Disappearance of Hog Island

Skyscrapers May Shiver and Sway, but They’re Perfectly Safe (Just Stay Away From the Windows)

ATC Wind Speed by Location

Severe Wind Gust Risk for Australian Capital Cities – A National Risk Assessment approach

A simulation model for assessing bird wind turbine collision risk

Model-Based Estimation of Collision Risks of Predatory Birds with Wind Turbines

Managing Wind Pool Risk with Portfolio Optimization

Wind Gust Forecasting: Managing Risk to Construction Projects


Model Control Life Cycle

Life Cycle

At the center of ERM is the implementation of the model control life cycle. There are four components:

  1. Analyze and determine the key risks to which an entity is exposed,
  2. Design and implement models to estimate the impact of the risks,
  3. Simulate, aggregate, and allocate results to quantify the capital impact of the risks, and
  4. Evaluate, report, and determine the strengths and weaknesses of the models. Once these steps are complete, you return to step 1 to determine how to improve the models or how to add another risk to the existing set of models.
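The four steps can be sketched as a simple control loop. Everything here is a hypothetical toy, with placeholder function names and numbers, but it shows how step 4 feeds back into step 1:

```python
def analyze_risks(entity):
    """Step 1: identify key risks (toy rule: exposures above a threshold)."""
    return [r for r in entity["exposures"] if r["size"] > 10]

def build_models(risks):
    """Step 2: one toy model per risk: loss = size * assumed severity factor."""
    return [(r["name"], r["size"] * 0.25) for r in risks]

def simulate_and_aggregate(models):
    """Step 3: aggregate modeled losses into a single capital figure."""
    return sum(loss for _, loss in models)

def evaluate_and_report(capital, entity):
    """Step 4: report results; a real cycle would revise scope/assumptions here."""
    print(f"Estimated capital impact: {capital:.2f}")
    return entity

entity = {"exposures": [{"name": "wind", "size": 40},
                        {"name": "rates", "size": 25},
                        {"name": "lapse", "size": 5}]}
for _ in range(2):  # after evaluating, return to step 1
    risks = analyze_risks(entity)
    models = build_models(risks)
    capital = simulate_and_aggregate(models)
    entity = evaluate_and_report(capital, entity)
```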

So, as you continue to pursue your career in enterprise risk management, you will find that multiple skills are required to implement and maintain the ERM model life cycle. The first is the ability to examine an entity, such as a company, a line of business, or a country, and determine the various risks to which that entity is exposed. This risk assessment skill is central to step one above. Also, using risk assessments, you will determine which risks will be in or out of scope for that specific model cycle.

After determining which risks are in scope, the second skill you develop is the ability to design and implement the models to estimate the impact of these risks, which meets step two of the life cycle. The final skill emphasized in this blog is the ability to use the models to simulate and aggregate the results, which corresponds to step three.


Model Use (Post 3)

After a corporate model is constructed, the practitioner uses the results in several ways. Some of these are:

  1. Gain insight on the business modeled.
  2. Determine risks to the company.
  3. Observe the scenarios that give adverse model results.
  4. Increase reserves, create hedges or make product enhancements to reduce the risk exposure or adverse results.

The internal company standards and the external regulatory controls require the practitioner to determine risk levels from corporate models. It is of paramount importance to understand the impact that different economic drivers, product designs or investment/disinvestment strategies have on the behavior of a corporate model. This includes the determination of when (and how often) model results from scenarios fall in ‘bad’ locations. This knowledge allows one to interpret the potential magnitude of the company’s risk exposure.  While adverse results occur relatively infrequently in scenario testing (unless alternative volatility assumptions are considered), the practitioner desires to gain more knowledge of these adverse results without paying the cost of projecting additional scenarios to increase the number of “hits” in the region of adverse results needed for statistical validity.

These adverse locations are discovered by first placing a valuation of economic capital on the company’s position, scenario by scenario. These valuations are then sorted into increasing or decreasing order. From these ordered results, the location of the adverse results is found at either the highest or lowest valuations. The study and analysis of ordered or sorted samples uses either order statistics, extreme value statistics, or the theory of records. Due to modeling cost, we need to approximate the relationship between the input economic scenarios and the EC output results without additional computer processing. Also, if one is able to target the location of adverse results when developing this relationship, all the better.
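A minimal sketch of that sorting step, with simulated EC valuations standing in for actual scenario results:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated per-scenario economic capital valuations (placeholder data)
ec_by_scenario = rng.lognormal(mean=3.0, sigma=0.5, size=1000)

# Sort in decreasing order; the adverse "locations" sit at the largest values
order = np.argsort(ec_by_scenario)[::-1]
worst_scenarios = order[:20]           # indices of the 20 worst scenarios
threshold = ec_by_scenario[order[19]]  # order-statistic cutoff for "adverse"

print(f"adverse threshold (20th largest EC): {threshold:.2f}")
```

Knowing which scenario indices land in the tail is what lets you go back and study the economic inputs that produced them.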

A model office or a corporate model, and even more so the understanding that arises from the use of those models, strengthens our decision making. Frequently, we make reasoned decisions using a few deterministic scenarios instead of a full suite of stochastic scenarios; however, even though we understand the underlying mechanics, we do not understand the likelihood of the impact of a risk unless we use a larger suite. As we use more scenarios, complexity increases, we may lose our understanding of the mechanics, and we encounter the proverbial situation of not being able to see the forest for the trees. But ignorance that arises from complexity is not always a bad thing. It forces the modeler or the business professional to broaden their skill set to gain deeper insight, and this leads to further product improvements or at least an understanding of model limitations.

Advances in technology bring new techniques from predictive analytics and data science that can be applied to these complex situations, allowing us to draw understanding between the scenario inputs and the corporate results.
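As a rough sketch of that idea, a cheap surrogate can be fit between scenario inputs and EC outputs, so that new scenarios can be screened without rerunning the full model. Here a simple least-squares fit on simulated data; the drivers, coefficients, and noise level are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Scenario inputs: two economic drivers (e.g., a rate level and an equity shock)
X = rng.normal(size=(500, 2))
# Simulated EC outputs with an assumed known relationship plus noise
y = 5.0 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + rng.normal(scale=0.1, size=500)

# Fit a linear surrogate: EC ~ b0 + b1*driver1 + b2*driver2
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))

# The surrogate predicts EC for a new scenario without more model runs
new_scenario = np.array([1.0, 0.5, -1.0])  # [intercept, driver1, driver2]
print("predicted EC:", new_scenario @ coef)
```

In practice the surrogate could just as well be a tree ensemble or any other predictive-analytics tool; the point is the cheap input-to-output mapping.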


Model Use (Continued)

It is important to keep a clear perspective when using multiple economic scenarios in computer simulations. We can gain significant insight about the risk exposure from the economy using stochastic simulation. By examining multiple possibilities, we can protect ourselves as best as feasible. We realize that only one path actually emerges as in the recent economic meltdown. Therefore, the practitioner must continually evaluate the economy and make reasoned business decisions to maintain existing business and to acquire new business.

The risk appetite of company management must also govern these business decisions. Insolvency must be considered and avoided. However, the practitioner cannot remove all risk of insolvency, because the cost of the associated hedges becomes so prohibitive that the company is unable to conduct business. Accordingly, the practitioner should understand where the product or business line places the company at risk and be able to communicate to upper management the specific risk exposure.

ERM practitioners, valuation actuaries, asset/liability management actuaries, CFOs and CROs of insurance companies confront issues that are vast and complex, including:

  1. Calculating the probability and/or impact of bankruptcy either by scenario testing or by determining the company’s value at risk.
  2. Determining the initial capital allocation for a new line of business.
  3. Assuring that reserves are adequate for new and existing lines of business.
  4. Understanding how different lines of business are sensitive to the level of interest rates, corporate spreads, volatility of other economic indicators (such as stock indices), and the changes in the levels of these variables.
  5. Estimating other risks to which the company is exposed in a timely fashion.
  6. Pricing complex policy features to obtain profitability, while maintaining a competitive market position.
  7. Aiding in the design and pricing of dynamic hedges to reduce the risk of extreme events.
  8. Designing and pricing the securitization of various cashflows to reduce risk-based capital requirements and various types of reserves such as XXX or AXXX.
  9. Revising and designing investment strategies to improve the return on assets that back company liabilities.

All of the above issues require timely and accurate valuation of different complex corporate models. When conducting the analysis on models the practitioner goes through the following model life cycle:

  1. Collect relevant data.
  2. Make relevant assumptions.
  3. Construct the model.
  4. Validate the model for reasonableness.
  5. Apply the model to solve a problem or understand the impact of changing conditions.
  6. Revise the model.

Model Use – Strengths and Weaknesses

In industry, regulation and/or professional standards require us to conduct computer simulations on different lines of business to determine when the business performs poorly. We model our business as accurately as possible, allowing for interest and asset performance, changing prices and expense loads. In addition, we often make many other assumptions such as the term structure of interest rates, future interest rates, projected stock market returns, asset default probabilities, psychology, and the relationships of our decrements to the level of interest rates or the stock market. Computer simulations reveal the behavior of the business relative to these assumptions. We do not know the actual statistical distribution of our business model results. We assume that the computer simulation results are representative (within some degree of confidence) in certain areas of interest, such as the extreme tail. We need to determine if our models are valid (again within some degree of confidence). If valid, then we calculate either economic capital or stand-alone capital within the accuracy of these computer models. In addition, we want to observe the potential risks associated with either the enterprise, product or line of business.
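One way to quantify "within some degree of confidence" for a tail statistic is to bootstrap it. A minimal sketch, using simulated results in place of actual model output:

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated model results (placeholder for actual scenario output)
results = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

# Bootstrap a confidence interval around the 98th-percentile estimate
boot = [np.percentile(rng.choice(results, size=results.size, replace=True), 98)
        for _ in range(500)]
lo, hi = np.percentile(boot, [2.5, 97.5])
point = np.percentile(results, 98)
print(f"98th percentile: {point:.2f} (95% CI: {lo:.2f} to {hi:.2f})")
```

A wide interval here is a signal that the tail of interest is not yet well resolved and more scenarios (or a fitted tail model) may be needed.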

Computer simulations of complex corporate models become very expensive in processing time as the number of scenarios increases. The need to obtain a timely answer often outweighs the need for information from additional scenarios.

Most computer business models are limited by the knowledge that we have about the basic assumptions used. We must be careful in how we think about and use these models. At a fundamental level, the models are neither correct nor assumed to be accurate. However, the benefit of using the computer to model actual business products and lines is that we can obtain an understanding of the different risks to which that product or line is exposed. Once we have this understanding, we can consider several methods to reduce the impact of any given risk. Such methods include product redesign, reserve strengthening, deferred expense write downs, asset hedging strategies, stopping rules (rules that recommend when to get out of a market), derivative positions, or over-capitalization.

Once we gain a basic understanding of the risks and design, say, a hedging strategy, we must remember that these models are not accurate, due to oversimplification of the model, lack of knowledge and insight, lack of confidence in the assumptions, or incorrect computer code. We cannot trust the model output as the “truth,” but we can trust the knowledge and insight that we gain from the process of modeling. If done correctly, we know both the strengths and weaknesses of the model. For instance, when constructing a hedge to protect against the risks demonstrated by the model, we must not implement a hedge that optimizes against areas of model weakness. Ultimately, the model does not tell us what to do, but it does make us more informed to make better business decisions.