4. Reference

Options to autoclass etc.

  • undoc-members: document members with no docstrings

  • private-members: document _ and __ members

  • special-members: document __special__ (dunder) members

  • inherited-members

Underwriter Module

Underwriter Class

class aggregate.underwriter.Underwriter(databases=None, update=False, log2=10, debug=False, create_all=False)[source]

The Underwriter class manages the creation of Aggregate and Portfolio objects, and maintains a database of standard Severity (curves) and Aggregate (unit or line level) objects. The Underwriter knows about all the business that is written!

  • Handles persistence to and from agg files

  • Is interface into program parser

  • Handles safe lookup from database for parser

Objects have a kind (agg, port, or sev) and a name. E.g. agg MyAgg … has kind agg and name MyAgg. They have a representation as a program. When the program is interpreted it produces a string spec that can be used to create the object. The factory method can create any object from the (kind, name, spec, program) quartet, though, strictly, program is not needed.

The underwriter knowledge is stored in a dataframe indexed by kind and name with columns spec and program.
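For orientation, a minimal sketch of inspecting that dataframe, assuming the installed default databases are available and Underwriter is importable from the package root:

from aggregate import Underwriter

uw = Underwriter(databases='default')   # load the installed databases
print(uw.knowledge.head())              # indexed by (kind, name) with spec and program columns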

__init__(databases=None, update=False, log2=10, debug=False, create_all=False)[source]

Create an Underwriter.

Parameters
  • dir_name

  • name

  • databases – if None, nothing is loaded; if 'default' (or 'default' appears in the list), load the installed databases; if 'site' (or 'site' appears in the list), load all site databases (home() / agg, which is created if it does not exist); other entries are treated as database names in home() / agg and are then loaded. Databases not in the site directory must be fully qualified path names.

  • update

  • log2

  • debug – run parser in debug mode

  • create_all – by default write only creates portfolios.

_read_db(db_path)[source]
_repr_html_()[source]
build(program, update=True, create_all=None, log2=- 1, bs=0, log_level=None, **kwargs)[source]

Convenience function to make work easy for the user. Intelligent auto updating. Detects discrete distributions and sets bs = 1.

The build method sets the logger level to 30 by default.

__call__ is set equal to build.

Parameters
  • program

  • update – build’s update

  • create_all – for just this run

  • log2 – -1 is the default: figure log2 automatically for discrete distributions and use 13 for all others. An input value overrides and cancels the discrete computation (useful for large discrete outcomes where the bucket happens to be 1).

  • bs

  • log_level

  • kwargs – passed to update, e.g., padding. Note force_severity=True is applied automatically.

Returns

created object(s)
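A minimal usage sketch via the module-level build object (see the build variable below); the DSL line is modeled on the sample program under write, and the names and values are illustrative.

import aggregate as agg

# agg.build wraps an Underwriter; per the note above, __call__ is set equal to build
a = agg.build('agg Example 10 claims 3 x 2 sev gamma 5 cv 0.30 poisson')
print(a)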

describe(kinds=None)[source]

More informative version of list including notes.

TODO: enhance!

Returns

factory(kind, name, spec, program)[source]

Create object of kind from spec, a dictionary. Creating from uw obviously needs the uw, so this is NOT a staticmethod!

Parameters
  • kind

  • name

  • spec

Returns

interpret_program(portfolio_program)[source]

Preprocess and then parse one line at a time.

Error handling through parser.

The old parse_portfolio_program has been replaced by build.interpret_one and interpret_test, and by running write.

Parameters

portfolio_program

Returns

interpreter_file(where='', filename='')[source]

Run a suite of test programs. For detailed analysis, run_one.

interpreter_line(program, name='one off', debug=True)[source]

Interpret a single line of code in debug mode. name is the index of the output.

interpreter_list(program_list)[source]

Interpret single test in debug mode.

interpreter_work(iterable, debug=False)[source]

Do all the work for the test; allows input to be marshalled into the tester in different ways. Unlike the production interpret_program, it runs one line at a time. Each line is preprocessed, run through a clean parser, and the output analyzed.

The last column, program as input, is only changed if the preprocessor changes the program.

Returns

DataFrame

property knowledge

Return the knowledge as a nice dataframe

Returns

list()[source]

List summary of the knowledge

Returns

static logger_level(level)[source]

Convenience function to set the logger level.

Parameters

level

Returns
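For example (30 is the logging WARNING level, matching the build default noted above):

from aggregate import Underwriter

Underwriter.logger_level(30)   # static method; quietens aggregate logging to WARNING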

safe_lookup(buildinid)[source]

Lookup buildinid=kind.name in uw to find expected kind and merge safely into self.arg_dict.

Different from getitem because it splits the item into kind and name and double checks you get the expected kind.

Parameters

buildinid – a string in kind.name format

Returns

write(portfolio_program, log2=0, bs=0, create_all=None, update=None, **kwargs)[source]

Write a natural language program. Write carries out the following steps.

  1. Read in the program and clean it (e.g. punctuation, parens etc. are removed and ignored; ; is replaced with a newline, etc.)

  2. Parse line by line to create a dictionary definition of sev, agg or port objects.

  3. Replace sev.name, agg.name and port.name references with their objects.

  4. If create_all set, create all objects and return in dictionary. If not set only create the port objects.

  5. If update set, update all created objects.

Sample input

port MY_PORTFOLIO
    agg Line1 20  loss 3 x 2 sev gamma 5 cv 0.30 mixed gamma 0.4
    agg Line2 10  claims 3 x 2 sev gamma 12 cv 0.30 mixed gamma 1.2
    agg Line3 100  premium at 0.4 3 x 2 sev 4 @ lognormal 3 cv 0.8 fixed 1

The indents are required if each agg item appears on a new line.

See parser for full language spec! See Aggregate class for many examples.

Reasonable kwargs:

  • bs

  • log2

  • update – overrides class default

  • add_exa – should port.add_exa add the exa-related columns to the output?

  • create_all – create all objects; the default is to create just portfolios. You generally don’t want to create the underlying sevs and aggs in a portfolio.

Parameters
  • portfolio_program

  • create_all – override class default

  • update – override class default

  • kwargs – passed to object’s update method if update==True

Returns

single created object or dictionary name: object
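A hedged sketch feeding (a trimmed version of) the sample program above to write; update is left off here and the database choice is an assumption.

import aggregate as agg

uw = agg.Underwriter(databases='default')
program = ('port MY_PORTFOLIO\n'
           '    agg Line1 20 loss 3 x 2 sev gamma 5 cv 0.30 mixed gamma 0.4\n'
           '    agg Line2 10 claims 3 x 2 sev gamma 12 cv 0.30 mixed gamma 1.2')
out = uw.write(program, update=False)   # single created object or dictionary name: object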

write_from_file(file_name, log2=0, bs=0, update=False, **kwargs)[source]

Read program from file. Delegates to write.

Parameters
  • file_name

  • log2

  • bs

  • update

  • kwargs

Returns

build Variable

Parser Module

Lexer Class

class aggregate.parser.UnderwritingLexer[source]

Implements the Lexer for agg language.

BUILTIN_AGG = 'agg\\.[a-zA-Z][a-zA-Z0-9_:~]*'
BUILTIN_SEV = 'sev\\.[a-zA-Z][a-zA-Z0-9_:~]*'
DIVIDE = '/'
EQUAL_WEIGHT = '='
EXPONENT = '\\^|\\*\\*'
FREQ = 'binomial|pascal|poisson|bernoulli|geometric|fixed'
HOMOG_MULTIPLY = '@'
ID = '[a-zA-Z][\\.:~_a-zA-Z0-9]*'
MINUS = '\\-'
NOTE = 'note\\{[^\\}]*\\}'
NUMBER = '(\\d+\\.?\\d*|\\d*\\.\\d+)([eE](\\+|\\-)?\\d+)?'
PERCENT = '%'
PLUS = '\\+'
RANGE = ':'
TIMES = '\\*'
error(t)[source]
ignore = ' \t,\\|'
literals = {'!', '(', ')', '[', ']'}
newline(t)[source]
static preprocess(program)[source]

Separate preprocessor step, allowing it to be called separately. Preprocessing involves six steps:

  1. Remove // comments, through end of line

  2. Remove \n in [ ] (vectors) that appear from using f'{np.linspace(...)}'

  3. Semicolon ; mapped to newline

  4. Backslash (line continuation) mapped to space

  5. \n\t is replaced with space, supporting the tabbed indented Portfolio layout

  6. Split on newlines

Parameters

program

Returns
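A small sketch of the preprocessor in isolation; the two-line program is illustrative.

from aggregate.parser import UnderwritingLexer

program = ('agg A 10 claims sev gamma 5 cv 0.3 poisson  // comment, stripped in step 1\n'
           'agg B 5 claims sev gamma 12 cv 0.3 poisson')
lines = UnderwritingLexer.preprocess(program)   # cleaned program, split into lines
print(lines)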

tokens = {'AGG', 'AGGREGATE', 'AND', 'AT', 'BUILTIN_AGG', 'BUILTIN_SEV', 'CEDED', 'CLAIMS', 'CV', 'DFREQ', 'DIVIDE', 'DSEV', 'EQUAL_WEIGHT', 'EXP', 'EXPONENT', 'FREQ', 'HOMOG_MULTIPLY', 'ID', 'INFINITY', 'LOSS', 'LR', 'MINUS', 'MIXED', 'NET', 'NOTE', 'NUMBER', 'OCCURRENCE', 'OF', 'PART_OF', 'PERCENT', 'PLUS', 'PORT', 'PREMIUM', 'RANGE', 'SEV', 'SHARE_OF', 'TIMES', 'TO', 'TWEEDIE', 'WEIGHTS', 'XPS', 'XS', 'ZM', 'ZT'}

Parser Class

class aggregate.parser.UnderwritingParser(safe_lookup_function, debug=False)[source]

Implements the Parser for agg language.

agg_list(p)[source]
agg_out(p)[source]
agg_reins(p)[source]
answer(p)[source]
atom(p)[source]
builtin_agg(p)[source]
debugfile = WindowsPath('C:/Users/steve/aggregate/parser/parser.out')
dfreq(p)[source]
doutcomes(p)[source]
dprobs(p)[source]
dsev(p)[source]
static enhance_debugfile(f_out='')[source]

Put links in the parser.out debug file, if DEBUGFILE != ‘’

Parameters

f_out – Path or filename of output. If “” then DEBUGFILE.html used.

Returns

error(p)[source]

Default error handling function. This may be subclassed.

exposures(p)[source]
expr(p)[source]
factor(p)[source]
freq(p)[source]
idl(p)[source]
ids(p)[source]
layers(p)[source]
logger(msg, p)[source]
name(p)[source]
note(p)[source]
numberl(p)[source]
numbers(p)[source]
occ_reins(p)[source]
port_out(p)[source]
power(p)[source]
precedence = (('nonassoc', 'LOW'), ('nonassoc', 'LOCATION_ADD'), ('nonassoc', 'SCALE_MULTIPLY', 'HOMOG_MULTIPLY'), ('left', 'PLUS', 'MINUS'), ('left', 'TIMES', 'DIVIDE'), ('right', 'EXP'), ('right', 'EXPONENT'), ('nonassoc', 'PERCENT'))
reins_clause(p)[source]
reins_list(p)[source]
reset()[source]
sev(p)[source]
sev_clause(p)[source]
sev_out(p)[source]
term(p)[source]
tokens = {'AGG', 'AGGREGATE', 'AND', 'AT', 'BUILTIN_AGG', 'BUILTIN_SEV', 'CEDED', 'CLAIMS', 'CV', 'DFREQ', 'DIVIDE', 'DSEV', 'EQUAL_WEIGHT', 'EXP', 'EXPONENT', 'FREQ', 'HOMOG_MULTIPLY', 'ID', 'INFINITY', 'LOSS', 'LR', 'MINUS', 'MIXED', 'NET', 'NOTE', 'NUMBER', 'OCCURRENCE', 'OF', 'PART_OF', 'PERCENT', 'PLUS', 'PORT', 'PREMIUM', 'RANGE', 'SEV', 'SHARE_OF', 'TIMES', 'TO', 'TWEEDIE', 'WEIGHTS', 'XPS', 'XS', 'ZM', 'ZT'}
weights(p)[source]
xps(p)[source]

Remaining Functions

aggregate.parser.grammar(add_to_doc=False, save_to_fn='')[source]

Write the grammar at the top of the file as a docstring

To work with multi rules use ‘, ‘ all on one line

Distribution Classes

Frequency Class

Severity Class

Aggregate Class

CarefulInverse Class

Distortion Class

class aggregate.spectral.Distortion(name, shape, r0=0.0, df=None, col_x='', col_y='', display_name='')[source]

Creation and management of distortion functions.

0.9.4: renamed roe to ccoc, but kept creator with roe for backwards compatibility.

__init__(name, shape, r0=0.0, df=None, col_x='', col_y='', display_name='')[source]

Create a new distortion.

Tester:

import numpy as np
import pandas as pd
import aggregate as agg
from IPython.display import display

ps = np.linspace(0, 1, 201)
for dn in agg.Distortion.available_distortions(True):
    if dn == 'clin':
        # shape param must be > 1
        g_dist = agg.Distortion(**{'name': dn, 'shape': 1.25, 'r0': 0.02, 'df': 5.5})
    else:
        g_dist = agg.Distortion(**{'name': dn, 'shape': 0.5, 'r0': 0.02, 'df': 5.5})
    g_dist.plot()
    g = g_dist.g
    g_inv = g_dist.g_inv

    df = pd.DataFrame({'p': ps, 'gg_inv': g(g_inv(ps)), 'g_invg': g_inv(g(ps)),
                       'g': g(ps), 'g_inv': g_inv(ps)})
    print(dn)
    print("errors")
    display(df.query('abs(gg_inv - g_invg) > 1e-5'))
Parameters
  • name – name of an available distortion, call Distortion.available_distortions() for a list

  • shape – shape parameter; a float or [float, float]

  • r0 – risk free or rental rate of interest

  • df – for the convex envelope, a dataframe with col_x and col_y used to parameterize; otherwise the degrees of freedom for the t distribution

  • col_x

  • col_y

  • display_name – overrides name; useful for parameterized convex distortions
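A minimal sketch creating and evaluating one distortion; 'ph' (proportional hazard) is assumed to be among the names returned by available_distortions, and the shape value is illustrative.

import numpy as np
import aggregate as agg

d = agg.Distortion('ph', 0.5)        # proportional hazard with shape 0.5 (illustrative)
s = np.linspace(0, 1, 11)
print(d.g(s))                        # distorted probabilities g(s)
print(d.g_inv(d.g(s)))               # round trip, cf. the tester above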

classmethod available_distortions(pricing=True, strict=True)[source]

List of the available distortions.

Parameters
  • pricing – only return list suitable for pricing, excludes tvar and convex

  • strict – only include those without mass at zero (pricing only)

Returns

static average_distortion(data, display_name, n=201, el_col='EL', spread_col='Spread')[source]

Create average distortion from (s, g(s)) pairs. Each point defines a wtdTVaR with p=s and p=1 points.

Parameters
  • data

  • display_name

  • n – number of s values between 0 and max(EL); 1 is added

  • el_col – column containing EL

  • spread_col – column containing Spread

Returns

static bagged_distortion(data, proportion, samples, display_name='', random_state=None)[source]

Make a distortion by bootstrap aggregation (Bagging) resampling, taking the convex envelope, and averaging from data.

Each sample uses proportion of the data.

Data must have two columns: EL and Spread

Parameters
  • data

  • proportion – proportion of data for each sample

  • samples – number of resamples

  • display_name – display_name of created distortion

  • random_state – passed to pd.sample; ensures reproducibility

Returns

static convex_example(source='bond')[source]

Example convex distortion using data from https://www.bis.org/publ/qtrpdf/r_qt0312e.pdf.

Parameters

source – bond gives a bond yield curve example, cat gives cat bond / cat reinsurance pricing based example

Returns

static distortions_from_params(params, index, r0=0.025, df=5.5, pricing=True, strict=True)[source]

Make a set of distortion functions and inverses from params, the output of port.calibrate_distortions. params must have just one row for each method and be in the output format of cal_dist.

Called by Portfolio.

Parameters
  • index

  • params – dataframe such that params[index, :] has a [lep, param] etc.; pricing=True, strict=True determine which distortions to allow; df is for the t distribution

  • r0 – min rol parameters

  • strict

  • pricing

Returns

plot(xs=None, n=101, both=True, ax=None, plot_points=True, scale='linear', c=None, **kwargs)[source]

Quick plot of the distortion

Parameters
  • xs

  • n – length of vector if no xs

  • both – True: plot g and ginv and add decorations, if False just g and no trimmings

  • ax

  • plot_points

  • scale – linear as usual; return plots -log(gs) vs -log(s) and inverts both scales

  • kwargs – passed to plot

Returns

static s_gs_distortion(s, gs, display_name='')[source]

Make a convex envelope distortion from {s, g(s)} points.

Parameters
  • s – iterable (can be converted into a numpy array)

  • gs

  • display_name

Returns

classmethod test(r0=0.035, df=[0.0, 0.9])[source]

Tester: make some nice plots of available distortions.

Returns

static wtd_tvar(ps, wts, display_name='', details=False)[source]

A careful version of weighted TVaR with knots at ps and weights wts.

Parameters
  • ps

  • wts

  • display_name

  • details

Returns
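A short sketch, assuming wtd_tvar returns a Distortion and that the weights sum to 1; the knots and weights are illustrative.

import aggregate as agg

d = agg.Distortion.wtd_tvar([0.0, 0.9, 0.995], [0.6, 0.3, 0.1], display_name='example wtd TVaR')
print(d.g(0.05))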

Utilities

Moment Aggregator Class

Moment Wrangler Class

Axis Manager Class

Utilities Module

Bounds Class

class aggregate.bounds.Bounds(distribution_spec)[source]

Implement IME 2022 pricing bounds methodology.

Typical usage: First, create a Portfolio or Aggregate object a. Then

bd = Bounds(a)
bd.tvar_cloud('line', premium=, a=, n_tps=, s=, kind=)
p_star = bd.p_star('line', premium)
bd.cloud_view(axes, ...)
Parameters

distribution_spec – Portfolio or Portfolio.density_df dataframe or pd.Series (must have loss as the index). If a DataFrame or Series, values are interpreted as a density and must sum to 1. F, S, exgta are all computed using the Portfolio methodology. If a DataFrame, line –> p_{line}.
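A hedged, concrete version of the typical usage above; the DSL line, the choice of line='total', and the numeric values (premium, asset level, numbers of points) are illustrative assumptions.

import aggregate as agg
from aggregate.bounds import Bounds

a = agg.build('agg Example 10 claims 3 x 2 sev gamma 5 cv 0.30 poisson')

bd = Bounds(a)
bd.tvar_cloud('total', premium=25, a=60, n_tps=64, s=128, kind='interp')
p_star = bd.p_star('total', premium=25, b=60)
print(p_star)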

__init__(distribution_spec)[source]
cloud_view(axs, n_resamples, scale='linear', alpha=0.05, pricing=True, distortions=None, title='', lim=(- 0.025, 1.025), check=False)[source]

Visualize the cloud with n_resamples.

Call after you have recomputed.

If there are distortions, plot them on a second axis.

Parameters
  • axs

  • n_resamples – if random sample

  • scale – linear or return

  • alpha – opacity

  • pricing – restrict to p_max = 0, ensuring g(s)<1 when s<1

  • distortions

  • title – optional title (applied to all plots)

  • lim – axis limits

  • check – construct and plot Distortions to check working ; reduces n_resamples to 5

Returns

compute_weight(premium, p0, p1, b=inf, kind='interp')[source]

Compute the weight for a single TVaR p0 < p1 value pair.

Parameters
  • premium

  • p0

  • p1

  • b

Returns

compute_weights(line, premium, n_tps, b=inf, kind='interp')[source]

Compute the weights of the extreme distortions

Applied to min(line, b) (allows it to work for net).

Note: independent of the asset level

Parameters
  • line – within port, or total

  • premium – target premium for the line

  • n_tps – number of tvar p points (tps).

  • b – loss bound: compute weights for min(line, b); generally used for net losses only.

Returns

make_ps(n, mode)[source]

Mode: are you making s points (always uniform) or tvar p points (use t_mode)?

self.t_mode == ‘u’: make uniform s points against which to evaluate g, from 0 to 1 inclusive, with more around 0.

self.t_mode == ‘gl’: make Gauss-Legendre p points at which TVaRs are evaluated, from 0 inclusive to 1 exclusive, with more around 1.

Parameters

n

Returns

make_tvar_function(line, b=inf)[source]

Make the tvar function from a Series p_total indexed by loss. Includes determining sup and putting in a value for zero. If sup is the largest value in the index, sup is set to inf.

Also sets self.Fb.

Applies to min(line, b).

Parameters
  • line

  • b – bound on the losses, e.g., to model limited liability insurer

Returns

p_star(line, premium, b=inf, kind='interp')[source]

Compute p* so TVaR @ p* of min(X, b) = premium

In this case the cap b has an impact (think of integrating q(p) over p to 1; q is impacted by b).

premium <= b is required (no rip-off condition).

If b < inf then we must solve TVaR(p) - (1 - F(b)) / (1 - p) * [TVaR(F(b)) - b] = premium. Let k = (1 - F(b)) * [TVaR(F(b)) - b], so we are solving

f(p) = TVaR(p) - k / (1 - p) - premium == 0

using Newton-Raphson.

Parameters
  • line

  • premium – target premium

  • b – bound

Returns

ped_distortion(n, solver='rs')[source]

Make the approximating distortion from the first n Principal Extreme Distortions (PEDs), using rs or ip solutions.

Parameters

n

Returns

principal_extreme_distortion_analysis(gs, pricing=False)[source]

Find the principal extreme distortion analysis to solve for gs = g(s), s=self.cloud_df.index

Assumes that tvar_cloud has been called and that cloud_df exists; len(gs) == len(cloud_df).

E.g., call

b = Bounds(port)
b.t_mode = 'u'
# set premium and asset level a
b.tvar_cloud('total', premium, a)
# make gs
b.principal_extreme_distortion_analysis(gs)

Parameters

  • gs – either g(s) evaluated on s = cloud_df.index, or the name of a calibrated distortion in distribution_spec.dists (created by a call to calibrate_distortions)

  • pricing – if True, try just using pricing distortions

Returns

quick_price(distortion, a)[source]

Price total to assets a using the distortion.

Requires distribution_spec to have a density_df dataframe with a p_total column.

TODO: add ability to price other lines.

Parameters
  • distortion

  • a

Returns

property t_mode
tvar_array(line, n_tps=256, b=inf, kind='interp')[source]

Compute tvars at n equally spaced points, tps.

Parameters
  • line

  • n_tps – number of tvar p points, default 256

  • b – cap on losses applied before computing TVaRs (e.g., adjust losses for finite assets b). Use np.inf for unlimited losses.

  • kind – if interp uses the standard function, easy, for continuous distributions; if ‘tail’ uses explicit integration of tail values, for discrete distributions

Returns

tvar_cloud(line, premium, a, n_tps, s, kind='interp')[source]

Weight down the tvar functions to the extremal convex measures.

The asset level a acts like an agg stop on what is being priced, i.e. we are working with min(X, a).

Parameters
  • line

  • premium

  • a

  • n_tps

  • s

  • b – bound, applies to min(line, b)

Returns

property tvar_df
tvar_hinges(s)[source]

Make the tvar hinge functions by evaluating each tvar_p(s) = min(1, s/(1-p)) for p in tps, at EP points s.

All arguments are in [0,1] x [0,1].

Parameters

s

Returns
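The hinge functions themselves are simple; a pure-numpy illustration of the formula above (not a call to the class method):

import numpy as np

s = np.linspace(0, 1, 101)                          # EP points
tps = [0.0, 0.9, 0.99]                              # tvar p points
hinges = {p: np.minimum(1.0, s / (1.0 - p)) for p in tps}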

tvar_with_bound(p, b=inf, kind='interp')[source]

Compute tvar taking bound into account. Assumes tvar_function setup.

Warning: b must equal the b used when calibrated. The issue is that computing F varies with the type of underlying portfolio. This is fragile. Storing b and checking equality has been added. For backwards compatibility the b argument is kept.

Parameters
  • p

  • b

Returns

weight_image(ax, levels=20, colorbar=True)[source]
aggregate.bounds.plot_lee(port, ax, c, lw=2)[source]

Lee diagram by hand

aggregate.bounds.plot_max_min(self, ax)[source]

Extracted from bounds, self=Bounds object

aggregate.bounds.similar_risks_example()[source]

Interesting beta risks and how to use similar_risks_sa.

Returns

aggregate.bounds.similar_risks_graphs_sa(axd, bounds, port, pnew, roe, prem)[source]

Stand-alone; ONLY WORKS FOR BOUNDED PORTFOLIOS (use for beta mixture examples). Updated version in CaseStudy.

axd – from mosaic
bounds – Bounds class from port (calibrated to some base)
pnew – new portfolio; input a new beta(a, b) portfolio, using the existing bounds object

sample: see similar_risks_sample()

Provenance : from make_port in Examples_2022_post_publish