Merged

#717 #770

Changes from all commits
44 commits
f65af55
REFACTOR: Reorganize type hierarchy of sparse.py. Move array-level fu…
genedan May 6, 2026
a33f181
FIX: Apply bugbot fixes.
genedan May 6, 2026
41bf857
FIX: Apply bugbot fixes.
genedan May 6, 2026
e48a28e
FIX: Apply bugbot fixes.
genedan May 6, 2026
0715793
TEST: Add unit tests to chainladder.utils.sparse to cover missing lines.
genedan May 7, 2026
8bddc35
Expanded apriori docstring
kennethshsu May 7, 2026
022d140
Improved docstring and added examples
kennethshsu May 7, 2026
625def8
added code for doc tests
kennethshsu May 7, 2026
fb80133
Added a new checkbox to remind PR to run tests
kennethshsu May 7, 2026
49daeed
better command and surpressed some warnings
kennethshsu May 7, 2026
d8b6f02
Added example, removed reference to chainladder
kennethshsu May 7, 2026
39b25f7
doctstrings with examples
kennethshsu May 7, 2026
35de8fc
Added covergence to the n_iters parameter.
kennethshsu May 7, 2026
66ce8a5
docstring improvement and examples
kennethshsu May 7, 2026
e068267
Added expectedloss method
kennethshsu May 7, 2026
39d33b1
Improved docstrings and examples
kennethshsu May 7, 2026
0d15ebc
added load_sample
kennethshsu May 8, 2026
91b4721
Build(deps): Bump mistune from 3.1.4 to 3.2.1
dependabot[bot] May 8, 2026
574c02a
Merge pull request #748 from casact/dependabot/uv/mistune-3.2.1
genedan May 8, 2026
ef6d24f
Improved docstring
kennethshsu May 8, 2026
b127760
Clarification
kennethshsu May 8, 2026
8c6e285
Update pyproject.toml
wendy-w2029 May 8, 2026
9c33931
Updated the example in fit
kennethshsu May 8, 2026
4c21eb3
v.0.9.2 release notes
kennethshsu May 8, 2026
09f799d
Merge pull request #750 from wendy-w2029/add-numpy-dependency
kennethshsu May 8, 2026
9b0dac7
Added a last minute PR
kennethshsu May 8, 2026
4dd89ec
Added more releaes notes from older versions that were not included i…
kennethshsu May 8, 2026
d1f2a25
Bumping pandas to 2.3.3 in preparation of pandas 3.0
kennethshsu May 9, 2026
0dae8fa
Merge pull request #747 from casact/#704
kennethshsu May 9, 2026
d318b33
Merge branch 'main' of https://github.com/casact/chainladder-python i…
kennethshsu May 9, 2026
0ed707a
Small typo, thanks AF!
kennethshsu May 9, 2026
70a7be1
One more approved PR (docstrings improvement)
kennethshsu May 9, 2026
0f9bf4e
bumping pandas
kennethshsu May 9, 2026
ce3ed47
Fix Adjustments API page linking to chainladder.workflow (#762)
SaguaroDev May 10, 2026
f7ee445
Annotate matplotlib dependency as required for TriangleDisplay.heatma…
SaguaroDev May 11, 2026
51fb041
Merge branch 'main' of https://github.com/casact/chainladder-python i…
kennethshsu May 11, 2026
c5ab6f4
Merge pull request #739 from casact/#737-sparse-hierarchy
genedan May 11, 2026
02d1480
Build(deps): Bump urllib3 from 2.6.3 to 2.7.0
dependabot[bot] May 11, 2026
4a611e6
Merge pull request #767 from casact/dependabot/uv/urllib3-2.7.0
kennethshsu May 11, 2026
482508b
Merge branch 'main' of https://github.com/casact/chainladder-python i…
kennethshsu May 11, 2026
9df6a5e
Updated date and some final merged PRs
kennethshsu May 11, 2026
2f6ea12
Merge pull request #751 from casact/0.9.2-release-prep
kennethshsu May 11, 2026
a627a2c
Add more installation options
kennethshsu May 11, 2026
c2c748c
Added uv run in the pytest
kennethshsu May 11, 2026
2 changes: 2 additions & 0 deletions .github/pull_request_template.md
@@ -6,3 +6,5 @@

## Additional Context for Reviewers


+ - [ ] I passed tests locally for both code (`uv run pytest`) and documentation changes (`uv run jb build docs --builder=custom --custom-builder=doctest`)
2 changes: 1 addition & 1 deletion chainladder/core/dunders.py
@@ -45,7 +45,7 @@ def _validate_arithmetic(self, other: Any) -> tuple:
if isinstance(other, np.ndarray) and self.array_backend != 'numpy':
obj = self.copy()
other = obj.get_array_module().array(other)
- elif isinstance(other, sp) and self.array_backend != 'sparse':
+ elif isinstance(other, sp.COO) and self.array_backend != 'sparse':
obj = self.set_backend('sparse')
else:
obj = self.copy()
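The change above makes sense if `sp` here is the imported `sparse` module rather than a class (an assumption; the diff does not show the import): `isinstance` requires a type as its second argument, so testing against a module object raises `TypeError` instead of performing the intended backend check. A minimal sketch using a stand-in module object:

```python
import types

# Stand-in for a module accidentally used as an isinstance target,
# e.g. `import sparse as sp` followed by `isinstance(other, sp)`.
sp = types.ModuleType("sparse")

try:
    isinstance(3, sp)
    outcome = "no error"
except TypeError:
    # isinstance() arg 2 must be a type, a tuple of types, or a union
    outcome = "TypeError"

print(outcome)
```

Checking against the concrete `sp.COO` class, as the fix does, is the idiomatic way to dispatch on the array backend.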
6 changes: 3 additions & 3 deletions chainladder/core/slice.py
@@ -103,7 +103,7 @@ def _sparse_setitem(self, key, values):
(self.obj.values.coords, np.array(key)[:, None]), 1)
self.obj.values.data = np.concatenate(
(self.obj.values.data, np.array([values])), 0)
- self.obj.values = self.obj.get_array_module()(
+ self.obj.values = self.obj.get_array_module().COO(
self.obj.values.coords, self.obj.values.data, prune=True,
has_duplicates=False, shape=self.obj.shape,
fill_value=self.obj.values.fill_value)
@@ -238,8 +238,8 @@ def __setitem__(self, key, value):
value.values.coords[1] = i
coords = np.concatenate((before.coords, value.values.coords), axis=1)
data = np.concatenate((before.data, value.values.data))
- self.values = xp(
-     coords, data, shape=self.shape, prune=True, fill_value=xp.nan
+ self.values = xp.COO(
+     coords, data, shape=self.shape, prune=True, fill_value=xp.COO.nan
)
else:
if isinstance(value, TriangleSlicer):
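The `_sparse_setitem` pattern in this file — append the new coordinate column and value, then rebuild the COO array — can be sketched with plain NumPy standing in for the sparse backend. The `coords`/`data` layout mirrors `sparse.COO`'s storage; the densifying step at the end is illustrative only, not something the Triangle code does:

```python
import numpy as np

# COO-style storage: coords is (ndim, nnz), data is (nnz,)
coords = np.array([[0, 1],
                   [0, 1]])         # existing entries at (0, 0) and (1, 1)
data = np.array([10.0, 20.0])
shape = (2, 2)

# "setitem" at (1, 0): concatenate the new coordinate column and value,
# exactly the step the diff performs before rebuilding the COO object
new_key = np.array([1, 0])[:, None]  # column vector, shape (2, 1)
coords = np.concatenate((coords, new_key), axis=1)
data = np.concatenate((data, np.array([30.0])))

# Densify to inspect the result (fill_value of NaN, as in the Triangle code)
dense = np.full(shape, np.nan)
dense[tuple(coords)] = data
print(dense)
```

Rebuilding with `prune=True` in the real code also drops any entries equal to the fill value, which is why untouched cells print as NaN rather than zero.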
2 changes: 1 addition & 1 deletion chainladder/core/triangle.py
@@ -524,7 +524,7 @@ def __init__(

# Construct Sparse multidimensional array.
self.values: COO = num_to_nan(
- sp(
+ sp.COO(
coords,
amts,
prune=True,
121 changes: 85 additions & 36 deletions chainladder/methods/benktander.py
@@ -16,6 +16,8 @@ class Benktander(MethodBase):
then use 1.0
n_iters: int, optional (default=1)
Number of iterations to use in the Benktander model.
+ When n_iters=1, the result is equivalent to the BornhuetterFerguson method.
+ When n_iters>>1, the result converges to the traditional Chainladder model.
apriori_sigma: float, optional (default=0.0)
Standard deviation of the apriori. When used in conjunction with the
bootstrap model, the model samples aprioris from a lognormal distribution
@@ -49,53 +51,83 @@ class Benktander(MethodBase):

.. testcode::

- tr = cl.load_sample('ukmotor')
- apriori = cl.Chainladder().fit(tr).ultimate_ * 0 + 14000
+ xyz = cl.load_sample("xyz")

- With ``n_iters=1`` Benktander reproduces Bornhuetter-Ferguson exactly.
+ ibnr = cl.Benktander().fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal).ibnr_
+ print(ibnr)

.. testoutput::

+ 2261
+ 1998 NaN
+ 1999 115.472127
+ 2000 914.033812
+ 2001 2432.394513
+ 2002 6037.026677
+ 2003 13928.934651
+ 2004 33925.451475
+ 2005 69724.761575
+ 2006 73410.593920
+ 2007 52977.560411
+ 2008 45873.769490

+ When `n_iters=1`, the model is exactly the same as the BornhuetterFerguson model.

.. testcode::

- print(
-     cl.Benktander(apriori=1.0, n_iters=1).fit(
-         tr, sample_weight=apriori
-     ).ultimate_
+ xyz = cl.load_sample("xyz")
+
+ bk_ibnr = (
+     cl.Benktander(n_iters=1)
+     .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
+     .ibnr_
)
+ bf_ibnr = (
+     cl.BornhuetterFerguson()
+     .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
+     .ibnr_
+ )
+ print(bk_ibnr - bf_ibnr)

.. testoutput::

- 2261
- 2007 12690.000000
- 2008 13121.098503
- 2009 14028.278620
- 2010 13272.048822
- 2011 13911.968891
- 2012 15614.145287
- 2013 16029.501746
-
- Increasing ``n_iters`` pulls the immature origins toward the chainladder
- estimate. The 2013 origin shows this most: ``16029`` at ``n_iters=1``,
- rising to ``19110`` at ``n_iters=4`` and approaching the chainladder
- ultimate of ``20680``.
+ 2261
+ 1998 NaN
+ 1999 NaN
+ 2000 NaN
+ 2001 NaN
+ 2002 NaN
+ 2003 NaN
+ 2004 NaN
+ 2005 NaN
+ 2006 NaN
+ 2007 NaN
+ 2008 NaN

+ When `n_iters>>1`, the model converges to the traditional Chainladder model.

.. testcode::

- print(
-     cl.Benktander(apriori=1.0, n_iters=4).fit(
-         tr, sample_weight=apriori
-     ).ultimate_
- )
+ xyz = cl.load_sample("xyz")
+
+ bk_ibnr = cl.Benktander(n_iters=1000).fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal).ibnr_
+ cl_ibnr = cl.Chainladder().fit(xyz["Paid"]).ibnr_
+ print(bk_ibnr - cl_ibnr)

.. testoutput::

2261
- 2007 12690.000000
- 2008 13096.902490
- 2009 14030.535854
- 2010 13138.365841
- 2011 13880.984774
- 2012 16719.527550
- 2013 19110.806503
+ 1998 NaN
+ 1999 NaN
+ 2000 NaN
+ 2001 1.455192e-11
+ 2002 -7.275958e-12
+ 2003 7.275958e-12
+ 2004 1.455192e-11
+ 2005 -1.455192e-11
+ 2006 2.910383e-11
+ 2007 -5.820766e-11
+ 2008 -7.275958e-11
"""

def __init__(self, apriori=1.0, n_iters=1, apriori_sigma=0, random_state=None):
@@ -132,13 +164,30 @@ def fit(self, X, y=None, sample_weight=None):

.. testcode::

- tr = cl.load_sample('ukmotor')
- apriori = cl.Chainladder().fit(tr).ultimate_ * 0 + 14000
- print(cl.Benktander(apriori=1.0, n_iters=2).fit(tr, sample_weight=apriori))
+ xyz = cl.load_sample("xyz")
+
+ ultimate = (
+     cl.Benktander(apriori=1, n_iters=2)
+     .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
+     .ultimate_
+ )
+ print(ultimate)

.. testoutput::

- Benktander(n_iters=2)
+ 2261
+ 1998 15822.000000
+ 1999 24908.397003
+ 2000 37547.676656
+ 2001 40511.198946
+ 2002 49417.354765
+ 2003 50042.095135
+ 2004 82437.601111
+ 2005 95417.171135
+ 2006 88485.508416
+ 2007 66882.788227
+ 2008 50708.755370

"""
if sample_weight is None:
raise ValueError("sample_weight is required.")
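The convergence claims added to the docstring (`n_iters=1` reproduces Bornhuetter-Ferguson, `n_iters>>1` converges to chainladder) follow from the Benktander recursion U_k = latest + (1 − 1/CDF) × U_{k−1} with U_0 equal to the apriori expected ultimate. A scalar sketch with illustrative numbers (not taken from any sample triangle):

```python
# Benktander recursion for a single origin period (illustrative values)
latest = 1000.0   # losses paid to date
cdf = 2.0         # cumulative development factor to ultimate
premium = 3000.0
apriori = 0.8     # expected loss ratio

def benktander_ultimate(n_iters: int) -> float:
    """U_0 = apriori expected ultimate; U_k = latest + (1 - 1/CDF) * U_{k-1}."""
    u = apriori * premium
    for _ in range(n_iters):
        u = latest + (1.0 - 1.0 / cdf) * u
    return u

# One iteration is exactly Bornhuetter-Ferguson
bf = latest + (1.0 - 1.0 / cdf) * (apriori * premium)
# The fixed point of the recursion is the chainladder ultimate
cl_ult = latest * cdf

print(benktander_ultimate(1) == bf)                     # n_iters=1 -> BF
print(abs(benktander_ultimate(1000) - cl_ult) < 1e-9)   # n_iters>>1 -> chainladder
```

The fixed point exists because the credibility weight 1 − 1/CDF is below 1, so each iteration shrinks the influence of the apriori geometrically.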
74 changes: 42 additions & 32 deletions chainladder/methods/bornferg.py
@@ -10,9 +10,14 @@ class BornhuetterFerguson(Benktander):
Parameters
----------
apriori: float, optional (default=1.0)
- Multiplier for the sample_weight used in the Bornhuetter Ferguson
- method. If sample_weight is already an apriori measure of ultimate,
- then use 1.0
+ Multiplier for the `sample_weight` used in the Bornhuetter Ferguson
+ method. If `sample_weight` is already an apriori measure of ultimate,
+ then use 1.0.
+ The recommended practice is to keep the model parameter assumption
+ separate from the data.
+ For example, if the apriori is 80% of premium, it is recommended to set
+ the apriori to 0.8 and leave the premium data in the `sample_weight`
+ argument unmodified.
apriori_sigma: float, optional (default=0.0)
Standard deviation of the apriori. When used in conjunction with the
bootstrap model, the model samples aprioris from a lognormal distribution
@@ -35,55 +40,60 @@
Examples
--------
Bornhuetter-Ferguson requires an apriori expected ultimate per origin,
- supplied through ``sample_weight``. ``sample_weight`` must be a
- chainladder Triangle aligned with ``X``, not a scalar; passing
- ``sample_weight=14000`` would raise ``AttributeError`` because the model
- accesses ``.shape``.
+ supplied through ``sample_weight``.

A common idiom for building a flat per-origin apriori is to take any
- same-shape Triangle, zero it out, and add the desired value. Below uses
- the chainladder ultimate as the shape donor.
+ same-shape Triangle, zero it out, and add the desired value. Here is an example.

.. testsetup::

import chainladder as cl

.. testcode::

- tr = cl.load_sample('ukmotor')
- cl_ult = cl.Chainladder().fit(tr).ultimate_
- apriori = cl_ult * 0 + float(cl_ult.sum()) / 7
- print(apriori)
+ raa = cl.load_sample("raa")
+ premium = raa.latest_diagonal * 0 + 40_000  # zero out and add 40,000 to each origin
+
+ ibnr = cl.BornhuetterFerguson(apriori=0.7).fit(X=raa, sample_weight=premium).ibnr_
+ print(ibnr)

.. testoutput::

2261
- 2007 14903.967562
- 2008 14903.967562
- 2009 14903.967562
- 2010 14903.967562
- 2011 14903.967562
- 2012 14903.967562
- 2013 14903.967562

- Fit with that apriori. The BF ultimates pull the immature origins toward
- the apriori while leaving mature origins close to chainladder.
+ 1981 NaN
+ 1982 255.707763
+ 1983 717.772687
+ 1984 1596.061515
+ 1985 2658.738155
+ 1986 5239.441491
+ 1987 8574.335344
+ 1988 12714.889984
+ 1989 18585.219714
+ 1990 24861.068855

+ One might be tempted to never set the apriori and instead modify the sample_weight directly. Both approaches give the same answer, but this is not the recommended practice: it not only adds confusion, it also mixes the model parameter assumption and the data together.

.. testcode::

- model = cl.BornhuetterFerguson(apriori=1.0).fit(tr, sample_weight=apriori)
- print(model.ultimate_)
+ raa = cl.load_sample("raa")
+ premium = raa.latest_diagonal * 0 + 40_000 * 0.7  # premium scaled by the 70% apriori
+
+ ibnr = cl.BornhuetterFerguson().fit(X=raa, sample_weight=premium).ibnr_
+ print(ibnr)

.. testoutput::

2261
- 2007 12690.000000
- 2008 13145.318280
- 2009 14095.125641
- 2010 13412.748068
- 2011 14150.549749
- 2012 15999.244850
- 2013 16658.824705
+ 1981 NaN
+ 1982 255.707763
+ 1983 717.772687
+ 1984 1596.061515
+ 1985 2658.738155
+ 1986 5239.441491
+ 1987 8574.335344
+ 1988 12714.889984
+ 1989 18585.219714
+ 1990 24861.068855
"""

def __init__(self, apriori=1.0, apriori_sigma=0.0, random_state=None):
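The recommendation in this docstring rests on BF being linear in the apriori: IBNR = apriori × premium × (1 − 1/CDF), so `apriori=0.7` with unmodified premium and `apriori=1.0` with premium pre-scaled by 0.7 are algebraically identical. A scalar sketch with illustrative numbers (the CDF below is an assumption, not a value from the `raa` sample):

```python
premium = 40_000.0
cdf = 1.25                         # cumulative development factor (illustrative)
pct_unreported = 1.0 - 1.0 / cdf   # expected unreported fraction

# Recommended: the apriori parameter carries the 70% assumption
ibnr_clean = 0.7 * premium * pct_unreported

# Discouraged: bake the 70% into the "data" and leave apriori at 1.0
ibnr_mixed = 1.0 * (premium * 0.7) * pct_unreported

print(ibnr_clean == ibnr_mixed)
```

Keeping the assumption in `apriori` leaves `sample_weight` as an auditable exposure base, which is why the docstring calls the second form discouraged even though the arithmetic agrees.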