statsmodels.gam.generalized_additive_model.GLMGam

class statsmodels.gam.generalized_additive_model.GLMGam(endog, exog=None, smoother=None, alpha=0, family=None, offset=None, exposure=None, missing='none', **kwargs)[source]

Model class for generalized additive models, GAM.

This inherits from GLM.

Warning: Not all inherited methods might correctly take the penalization into account. Not all options, including offset and exposure, have been verified yet.

Parameters:

endog : array_like

exog : array_like or None

These explanatory variables are treated as linear. The model in this case is a partial linear model.

smoother : instance of an additive smoother class, such as BSplines or CyclicCubicSplines

This is a required keyword argument.

alpha : list of floats

Penalization weights for the smooth terms. The length of the list needs to be the same as the number of smooth terms in the smoother.

family : instance of GLM family

see GLM

offset : None or array_like

see GLM

exposure : None or array_like

see GLM

missing : ‘none’

Missing value handling is not supported in this class.

kwargs :

Extra keywords are passed in the call to the superclasses.

Notes

Status: experimental. This has full unit test coverage for the core results with Gaussian and Poisson (without offset and exposure). Other options and additional results might not be correctly supported yet. (Binomial with counts, i.e. with n_trials, is most likely wrong in pirls. User specified var or freq weights are most likely also not correct for all results.)
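
The following is a minimal usage sketch, not part of the original docstring: it constructs a GLMGam directly with a BSplines smoother and a linear part on synthetic data. The data, the variable names, and the choices df=[10], degree=[3], and alpha=[1.0] are illustrative assumptions; alpha needs one entry per smooth term.

>>> import numpy as np
>>> import statsmodels.api as sm
>>> from statsmodels.gam.api import GLMGam, BSplines
>>> rng = np.random.default_rng(0)
>>> x_spline = rng.uniform(0., 1., size=(200, 1))           # column entering through the smoother
>>> x_linear = sm.add_constant(rng.normal(size=(200, 1)))    # constant plus a linear term (partial linear model)
>>> y = np.sin(2 * np.pi * x_spline[:, 0]) + x_linear[:, 1] + rng.normal(scale=0.3, size=200)
>>> bs = BSplines(x_spline, df=[10], degree=[3])             # one smooth term
>>> gam = GLMGam(y, exog=x_linear, smoother=bs, alpha=[1.0], family=sm.families.Gaussian())
>>> res = gam.fit()
>>> print(res.params)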

Attributes

endog_names Names of endogenous variables
exog_names Names of exogenous variables

Methods

estimate_scale(mu) Estimates the dispersion/scale.
estimate_tweedie_power(mu[, method, low, high]) Tweedie specific function to estimate scale and the variance parameter.
fit([start_params, maxiter, method, tol, …]) estimate parameters and create instance of GLMGamResults class
fit_constrained(constraints[, start_params]) fit the model subject to linear equality constraints
fit_regularized([method, alpha, …]) Return a regularized fit to a linear regression model.
from_formula(formula, data[, subset, drop_cols]) Create a Model from a formula and dataframe.
get_distribution(params[, scale, exog, …]) Returns a random number generator for the predictive distribution.
hessian(params[, pen_weight]) Hessian of model at params
hessian_factor(params[, scale, observed]) Weights for calculating Hessian
hessian_numdiff(params[, pen_weight]) hessian based on finite difference derivative
information(params[, scale]) Fisher information matrix.
initialize() Initialize a generalized linear model.
loglike(params[, pen_weight]) Log-likelihood of model at params
loglike_mu(mu[, scale]) Evaluate the log-likelihood for a generalized linear model.
loglikeobs(params[, pen_weight]) Log-likelihood of model observations at params
predict(params[, exog, exposure, offset, linear]) Return predicted values for a design matrix
score(params[, pen_weight]) Gradient of model at params
score_factor(params[, scale]) weights for score for each observation
score_numdiff(params[, pen_weight, method]) score based on finite difference derivative
score_obs(params[, pen_weight]) Gradient of model observations at params
score_test(params_constrained[, …]) score test for restrictions or for omitted variables
select_penweight([criterion, start_params, …]) find alpha by minimizing results criterion
select_penweight_kfold([alphas, …]) find alphas by k-fold cross-validation
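
As a hedged sketch continuing the construction example above (gam, y, x_linear, and bs are the objects built there; the k_folds value and the 'aic' criterion are illustrative choices), the penalization weights can be selected with the last two methods in the table and then used to refit the model:

>>> alpha_cv, cv_res = gam.select_penweight_kfold(k_folds=5)      # k-fold cross-validation over an alpha grid
>>> alpha_aic, opt_res = gam.select_penweight(criterion='aic')    # minimize an information criterion numerically
>>> res_cv = GLMGam(y, exog=x_linear, smoother=bs, alpha=alpha_cv, family=sm.families.Gaussian()).fit()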