
Trimming the Fat from glm() Models in R

One of the attractive aspects of logistic regression models (and of linear models in general) is their compactness: the size of the model grows with the number of coefficients, not with the size of the training data. With R, though, glm() models are not so concise; we noticed this to our dismay when we tried to automate fitting a moderate number of models (about 500 models, each with on the order of 50 coefficients) to data sets of moderate size (several tens of thousands of rows). A workspace save of the models alone was in the tens of gigabytes! How is this possible? We decided to find out.

As many R users know (but often forget), a glm model object carries a copy of its training data by default. You can use the arguments y=FALSE and model=FALSE in the call to glm() to turn this off.

set.seed(2325235)


# Set up a synthetic classification problem of a given size
# and two variables: one numeric, one categorical
# (two levels).
synthFrame = function(nrows) {
   d = data.frame(xN=rnorm(nrows),
      xC=sample(c('a','b'),size=nrows,replace=TRUE))
   d$y = (d$xN + ifelse(d$xC=='a',0.2,-0.2) + rnorm(nrows))>0.5
   d
}


# First show that y=FALSE and model=FALSE help reduce model size.

dTrain = synthFrame(1000)
model1 = glm(y~xN+xC,data=dTrain,family=binomial(link='logit'))
model2 = glm(y~xN+xC,data=dTrain,family=binomial(link='logit'),
             y=FALSE)
model3 = glm(y~xN+xC,data=dTrain,family=binomial(link='logit'),
             y=FALSE, model=FALSE)

#
# Estimate the object's size as the size of its serialization
#
length(serialize(model1, NULL))
# [1] 225251
length(serialize(model2, NULL))
# [1] 206341
length(serialize(model3, NULL))
# [1] 189562

dTest = synthFrame(100)
p1 = predict(model1, newdata=dTest, type='response')
p2 = predict(model2, newdata=dTest, type='response')
p3 = predict(model3, newdata=dTest, type='response')
sum(abs(p1-p2))
# [1] 0
sum(abs(p1-p3))
# [1] 0
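
Beyond y=FALSE and model=FALSE, a fitted glm object still retains per-row components (residuals, fitted values, weights, and the qr decomposition's $qr matrix) that predict() on new data never touches; in R, formulas and terms also carry environments that can keep large objects reachable, which is worth auditing if sizes still surprise you. Below is a minimal sketch of further trimming (my own illustration, assuming you will only ever call predict() without se.fit, and never summary() or anova()):

# stripGlm is my own illustrative helper (hypothetical name, not part of
# the post or of base R): null out components predict() does not need.
stripGlm = function(m) {
   m$y = NULL
   m$model = NULL
   m$data = NULL
   m$residuals = NULL
   m$fitted.values = NULL
   m$linear.predictors = NULL
   m$effects = NULL
   m$weights = NULL
   m$prior.weights = NULL
   m$qr$qr = NULL   # predict() only needs the pivot, not the decomposition
   m
}

model4 = stripGlm(model3)
length(serialize(model4, NULL))   # smaller again than model3
p4 = predict(model4, newdata=dTest, type='response')
sum(abs(p1-p4))                   # expected to be 0, as above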


Generalized linear models for predicting rates

I often need to build a predictive model that estimates rates. The example of our age is ad click-through rates (how often a viewer clicks on an ad, estimated as a function of the features of the ad and the viewer). Another timely example is estimating default rates of mortgages or credit cards. You could try linear regression, but specialized tools often do much better. For rate problems that involve estimating probabilities or frequencies, we recommend logistic regression. For non-frequency (and non-categorical) rate problems, such as forecasting yield or purity, we suggest beta regression.

In this note we will work a toy problem and suggest some relevant R analysis libraries.
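
To give a flavor of what such a toy problem can look like (the data and model names below are my own illustrative assumptions, not the post's actual example), here is a synthetic rate in (0,1) fit two ways: with a quasibinomial logistic regression, and with a beta regression from the CRAN package betareg.

# An illustrative sketch (my own, not the post's worked example).
library(betareg)   # CRAN package supplying beta regression

set.seed(52)
d = data.frame(x=rnorm(100))
# synthetic rate outcome, strictly inside (0,1)
d$y = plogis(0.5*d$x + rnorm(100, sd=0.25))

# logistic regression on a numeric rate: quasibinomial avoids the
# non-integer response warning that family=binomial would raise
mLogit = glm(y~x, data=d, family=quasibinomial(link='logit'))

# beta regression models the rate directly as beta-distributed
mBeta = betareg(y~x, data=d)

head(predict(mLogit, newdata=d, type='response'))
head(predict(mBeta, newdata=d, type='response'))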


A pathological glm() problem that doesn’t issue a warning

I know I have already written a lot about technicalities in logistic regression (see, for example: How robust is logistic regression? and Newton-Raphson can compute an average). But I just ran into a simple case where R's glm() implementation of logistic regression seems to fail without issuing a warning message. Yes, the data is a bit pathological, but one would hope for a diagnostic or warning message from the fitter.
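
The excerpt does not reproduce the pathological data, so I won't guess at it; but one grounded defensive habit it suggests is to inspect the convergence fields that glm() records rather than trusting a silent return. A minimal sketch (my own suggestion, not the post's fix, reusing the dTrain frame from the first excerpt):

# Defensive check: glm() records whether IRLS converged and how many
# iterations it used, even when it prints nothing.
m = glm(y~xN+xC, data=dTrain, family=binomial(link='logit'))
if (!m$converged) {
   warning('glm() did not converge; do not trust these coefficients')
}
m$iter   # iterations used; compare against glm.control()'s maxit (default 25)

# Tightening control can surface or resolve borderline cases:
# glm(y~xN+xC, data=dTrain, family=binomial(link='logit'),
#     control=glm.control(maxit=100, epsilon=1e-10))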


How robust is logistic regression?

Logistic regression is a popular and effective technique for modeling categorical outcomes as a function of both continuous and categorical variables. The question is: how robust is it? Or: how robust are the common implementations? (Note: we are using "robust" in the everyday English sense of "performs well for all inputs," not in the technical statistical sense of "immune to deviations from assumptions or outliers.")

Even a detailed reference such as "Categorical Data Analysis" (Alan Agresti, Wiley, 1990) leaves off with an empirical observation: "the convergence … for the Newton-Raphson method is usually fast" (chapter 4, section 4.7.3, page 117). This is the kind of book where, if there were a known proof that the estimation step is a contraction (one very strong guarantee of convergence), you would expect to see that proof reproduced. I always suspected there was some kind of Brouwer fixed-point theorem based folk-theorem proving absolute convergence of the Newton-Raphson method for the special case of logistic regression. This cannot be the case, as the Newton-Raphson method can diverge even on trivial full-rank, well-posed logistic regression problems.
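
To make the iteration under discussion concrete, here is the standard textbook Newton-Raphson (equivalently, IRLS) update for logistic regression, written as a one-step R function. This is my sketch of the textbook form, not code from the article; the article's point is that repeating exactly this full, undamped step can diverge.

# One full Newton-Raphson step for logistic regression (textbook form).
# X: n x k model matrix; y: 0/1 response vector; beta: current coefficients.
logisticNewtonStep = function(beta, X, y) {
   p = as.numeric(1/(1 + exp(-(X %*% beta))))   # current fitted probabilities
   W = p*(1-p)                                  # IRLS weights
   H = t(X) %*% (W * X)                         # X'WX: Hessian of the negative log-likelihood
   g = t(X) %*% (y - p)                         # gradient of the log-likelihood
   as.numeric(beta + solve(H, g))               # full, undamped Newton step
}

# Divergence happens when this full step repeatedly overshoots; practical
# fitters guard against it with damping or step halving.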


What does a generalized linear model do?

What does a generalized linear model do? R supplies a modeling function called glm() that fits generalized linear models (abbreviated as GLMs). A natural question is: what does it do, and what problem is it solving for you? We work some examples and place generalized linear models in context with other techniques.
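
As a one-line taste of that context (my own illustration, not the article's examples): with the gaussian family and identity link, glm() reproduces ordinary least squares, which is one way to see generalized linear models as a strict generalization of linear models.

# My illustration (not the article's): gaussian-family glm() recovers lm().
set.seed(352)
d = data.frame(x=rnorm(50))
d$y = 1 + 2*d$x + rnorm(50)

coef(lm(y~x, data=d))
coef(glm(y~x, data=d, family=gaussian(link='identity')))
# the two coefficient vectors agree up to numerical noise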