The difference between glm and LogitModelFit

I have a problem with the glm function in R.

In particular, I'm not sure how to include nominal variables.

The results that I get in R after running the glm function are as follows:

> df

   x1 x2 y
1  a  2  0
2  b  4  1
3  a  4  0
4  b  2  1
5  a  4  1
6  b  2  0

> str(df)
'data.frame':   6 obs. of  3 variables:
 $ x1: Factor w/ 2 levels "a","b": 1 2 1 2 1 2
 $ x2: num  2 4 4 2 4 2
 $ y: Factor w/ 2 levels "0","1": 1 2 1 2 2 1

Call:
glm(formula = y ~ x1 + x2, family = "binomial", data = df)

Coefficients:
             Estimate Std. Error z value Pr(>|z|)
(Intercept)   -39.132  15208.471  -0.003    0.998
x1b            19.566   7604.236   0.003    0.998
x2              9.783   3802.118   0.003    0.998

However, when I run the LogitModelFit function in Wolfram Mathematica, I get different parameters.

The code in Wolfram is below:

data = {{a, 2, 0}, {b, 4, 1}, {a, 4, 0}, {b, 2, 1}, {a, 4, 1}, {b, 2, 0}};

model = LogitModelFit[data, {x, y}, {x, y}, NominalVariables -> x]

model["BestFitParameters"]

And these are the estimated parameters:

{-18.5661, -18.5661, 9.28303}

model // Normal

1/(1 + E^(18.5661 - 9.28303 y + 18.5661 DiscreteIndicator[x, a, {a, b}]))

So what is different here? Why are the results so different?

Am I doing something wrong in R or in Wolfram?

2 answers

It is easier to see what is going on if you first collapse your 6 observations into the 4 distinct combinations of x1 and x2:

library(dplyr)
df %>% group_by(x1, x2) %>% summarise(n = n(), y = mean(y == "1"))  # y is a factor in your df, so count the "1"s
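
With the data frame shown in the question, that summary works out to the following four cells (the exact tibble printing will depend on your dplyr version):

  x1    x2     n     y
  a      2     1   0.0
  a      4     2   0.5
  b      2     2   0.5
  b      4     1   1.0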

The cell (a, 2) contains only 0s and the cell (b, 4) contains only 1s, so the data are quasi-separated and the maximum-likelihood estimates do not exist (the coefficients run off towards infinity and the fitting algorithm only stops because of its convergence tolerance).

First, to make the R output directly comparable with LogitModelFit, relevel x1 so that "b" is the reference level and R reports a coefficient for x1 == "a":

> df$x1 <- relevel(df$x1, "b")
> m <- glm(y ~ x1 + x2, family = binomial(), data = df, control = list(maxit = 100))
> summary(m)

Call:
glm(formula = y ~ x1 + x2, family = binomial(), data = df, control = list(maxit = 100))

Deviance Residuals: 
       1         2         3         4         5         6  
-0.00008   0.00008  -1.17741   1.17741   1.17741  -1.17741  

Coefficients:
            Estimate Std. Error z value Pr(>|z|)
(Intercept)  -19.566   7604.236  -0.003    0.998
x1a          -19.566   7604.236  -0.003    0.998
x2             9.783   3802.118   0.003    0.998

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 8.3178  on 5  degrees of freedom
Residual deviance: 5.5452  on 3  degrees of freedom
AIC: 11.545

Number of Fisher Scoring iterations: 18

(Note the enormous standard errors and the 18 Fisher scoring iterations; both are classic symptoms of separation.)

Neither set of estimates (glm's or Wolfram's) is "the" answer, because the true maximum-likelihood estimate does not exist here (the intercept is heading off to -Inf). Each program simply reports the point at which its iterations stopped, and both stopping points satisfy the same internal relationship (9.783 * 2 = 19.566), so they imply the same fitted probabilities.

The coefficients are tied together like this because x2 only takes the values 2 and 4 in your data, so the intercept and the x1 coefficient are exact multiples of the x2 coefficient (-39.132 = -4 * 9.783 and 19.566 = 2 * 9.783).
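
As a quick check (a minimal sketch that just plugs the printed coefficients into both linear predictors; plogis is the logistic function in base R), the two parameterizations give essentially the same fitted probabilities for the four covariate patterns:

# Four distinct covariate patterns from the data
pats <- data.frame(x1 = c("a", "a", "b", "b"), x2 = c(2, 4, 2, 4))

# glm parameterization: intercept, dummy for x1 == "b", slope for x2
eta_glm <- -39.132 + 19.566 * (pats$x1 == "b") + 9.783 * pats$x2

# LogitModelFit parameterization: intercept, indicator for x1 == "a", slope for x2
eta_lmf <- -18.5661 - 18.5661 * (pats$x1 == "a") + 9.28303 * pats$x2

cbind(pats, p_glm = plogis(eta_glm), p_lmf = plogis(eta_lmf))
# Both columns come out (essentially) 0, 0.5, 0.5, 1 for the four patterns.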


LogitModelFit gives you the model

1/(1 + E^(18.5661 - 9.28303 y + 18.5661 DiscreteIndicator[x, a, {a, b}]))

where DiscreteIndicator[x, a, {a, b}] is an indicator for x1 == 'a',

while in the glm output the x1b term is an indicator for x1 == 'b':

> str(df)
'data.frame':   6 obs. of  3 variables:
 $ x1: Factor w/ 2 levels "a","b": 1 2 1 2 1 2
 $ x2: num  2 4 4 2 4 2
 $ y: Factor w/ 2 levels "0","1": 1 2 1 2 2 1

Call:
glm(formula = y ~ x1 + x2, family = "binomial", data = df)

Coefficients:
             Estimate Std. Error z value Pr(>|z|)
(Intercept)   -39.132  15208.471  -0.003    0.998
x1b            19.566   7604.236   0.003    0.998
x2              9.783   3802.118   0.003    0.998

So, among other things, LogitModelFit and glm are using different dummy codings for the factor: LogitModelFit uses an indicator for x == 'a', while glm uses an indicator for x == 'b'.
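
You can see that the two codings describe the same model by re-expressing glm's coefficients with an indicator for 'a' instead of 'b' (a small sketch using the numbers printed above; since I(x1 == "b") = 1 - I(x1 == "a"), the 'b' effect folds into the intercept and the indicator coefficient flips sign):

# glm coefficients (dummy for x1 == "b")
b0 <- -39.132; b_x1b <- 19.566; b_x2 <- 9.783

# Same linear predictor rewritten with a dummy for x1 == "a"
b0_a  <- b0 + b_x1b   # new intercept: -19.566
b_x1a <- -b_x1b       # coefficient of the x1 == "a" indicator: -19.566

c(intercept = b0_a, x1a = b_x1a, x2 = b_x2)
# Compare with LogitModelFit's {-18.5661, -18.5661, 9.28303}: the structure matches,
# and the remaining difference in magnitude is only because each program stopped
# iterating at a different point (the MLE does not exist for these data).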

