Venter, Gary G. 2021. “Loss Reserving Using Estimation Methods Designed for Error Reduction.” Variance 14 (1).
  • Figure 1. Parameter ranges, Stan gamma fit v = (a2,a6,b2,b3,b4,b5,b7)
  • Figure 2. Row and column parameters for gamma in Stan and full regression, lognormal
  • Figure 3. Fitted CV and skewness for gamma and Weibull k fits
  • Figure 4. Gamma severity parameter ranges, v = (a2,a4,b2,b3,b4,b5,b7)
  • Figure 5. Row and column severity level parameters
  • Figure 6. Frequency row and column factors
  • Figure 7. Columns 1–10 parameter ranges for exposure log slope change variables
  • Figure 8. Factors
  • Figure 9. Student’s t with 6 degrees of freedom versus Laplace densities
  • Figure 10. LASSO parameter growth with shrinkage reducing


Maximum likelihood estimation has been the workhorse of statistics for decades, but alternative methods, collectively known as “regularization,” are proving to have lower predictive variance. Regularization shrinks fitted values toward the overall mean, much as credibility does. Good software is available for regularization; in particular, packages for Bayesian regularization make it easy to fit more complex models. One example given is a combined additive-multiplicative reserve model. In addition, residual distributions not available in generalized linear models are tried, and these can improve range estimates. Heteroscedasticity adjustments to standard distributions are used to explore the variance-mean relationship, along with skewness and related properties. Use of software packages is discussed, with sample code and output. The focus is on methodology, so projecting the fitted model to fill out the triangle is not addressed, but doing so is usually straightforward.
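The shrinkage the abstract describes can be sketched in a few lines. The example below is illustrative only (not taken from the paper): it compares raw maximum likelihood group means with a ridge-style penalized estimate that shrinks each group mean toward the grand mean, the same qualitative effect credibility weighting produces. The data, penalty value `lam`, and group structure are all invented for the demonstration.

```python
import numpy as np

# Simulated claim severities for three groups (illustrative data only).
rng = np.random.default_rng(0)
true_means = np.array([80.0, 100.0, 120.0])
data = [rng.normal(m, 25.0, size=5) for m in true_means]

grand_mean = np.mean(np.concatenate(data))
mle = np.array([d.mean() for d in data])  # maximum likelihood: raw group means

# Penalized fit: minimize sum_i (x_i - m_g)^2 + lam * (m_g - grand_mean)^2.
# The closed form shrinks each group mean toward the grand mean:
#   m_g = (n_g * xbar_g + lam * grand_mean) / (n_g + lam)
lam = 5.0  # arbitrary penalty strength for the sketch
n = np.array([len(d) for d in data])
shrunk = (n * mle + lam * grand_mean) / (n + lam)

# Each shrunk estimate lies between its raw group mean and the grand mean.
for raw, s in zip(mle, shrunk):
    lo, hi = sorted((raw, grand_mean))
    assert lo <= s <= hi
print(np.round(mle, 1), np.round(shrunk, 1))
```

Larger `lam` pulls every group estimate closer to the overall mean, trading a little bias for lower predictive variance; LASSO and Bayesian priors achieve the same effect with different penalty shapes.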

Accepted: March 12, 2019 EDT