ML vs. REML [🇷 for BE/BA]
❝ There is still something that really bothers me and that is the log-likelihood difference of R versus WNL. With REML I think the optimisation switches back and forth between the estimates. […]
Maybe, maybe not. PHX’ manual only tells me:
The linear mixed effects model is:
y = Xβ + Zγ + ε,
V = Variance(y) = ZGZᵀ + R.
Let θ be a vector consisting of the variance parameters in G and R. The full maximum likelihood procedure (ML) would simultaneously estimate both the fixed effects parameters β and the variance parameters θ by maximizing the likelihood of the observations y with respect to these parameters. In contrast, restricted maximum likelihood estimation (REML) maximizes a likelihood that is only a function of the variance parameters θ and the observations y, and not a function of the fixed effects parameters. Hence for models that do not contain any fixed effects, REML would be the same as ML.
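To see what that means in practice, here is a minimal sketch in R with `nlme::lme()` on a simulated 2×2×2 crossover dataset (the column names, effect sizes, and variances are purely illustrative, not bear’s or PHX’ actual setup):

```r
library(nlme)

## simulate a small 2x2x2 crossover dataset (purely illustrative numbers)
set.seed(123)
n <- 12                                              # subjects
bedata <- data.frame(
  subject  = factor(rep(1:n, each = 2)),
  sequence = factor(rep(rep(c("TR", "RT"), each = 2), n / 2)),
  period   = factor(rep(1:2, n))
)
bedata$treatment <- factor(ifelse((bedata$sequence == "TR") ==
                                  (bedata$period == "1"), "T", "R"))
bedata$logPK <- 4 + 0.05 * (bedata$treatment == "T") +       # fixed effects
                rep(rnorm(n, 0, 0.30), each = 2) +           # between-subject
                rnorm(2 * n, 0, 0.15)                        # within-subject

## identical model, two estimation methods
fit.reml <- lme(logPK ~ sequence + period + treatment,
                random = ~ 1 | subject, data = bedata, method = "REML")
fit.ml   <- lme(logPK ~ sequence + period + treatment,
                random = ~ 1 | subject, data = bedata, method = "ML")

fixef(fit.reml); fixef(fit.ml)      # beta: (nearly) the same
VarCorr(fit.reml); VarCorr(fit.ml)  # theta: REML larger (less biased downwards)
logLik(fit.reml); logLik(fit.ml)    # different criteria, not directly comparable
```

In such a setup the β’s hardly differ, but the variance estimates do, and the two log-likelihoods are maximising different criteria – which might also explain why the numbers reported by different software are not necessarily comparable.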
❝ ML would only be plain and simple covariance matrix fiddling, I think (?).
Well, we do have fixed effects, right? At least in PHX/WNL, only REML is implemented. IIRC, REML is recommended by Patterson/Jones somewhere.
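A hedged rule of thumb, continuing the illustrative fits above: REML criteria are only comparable between models with identical fixed effects, so if one wants to compare log-likelihoods or AICs across different fixed-effects structures, refit with ML first:

```r
## REML criteria are only comparable between models with the same fixed
## effects; for comparisons across different X, use the ML fits:
fit.ml0 <- update(fit.ml, . ~ . - period)  # drop the period effect (ML fit)
anova(fit.ml0, fit.ml)                     # LR test / AIC on comparable ML fits
AIC(fit.reml)                              # not comparable with AIC(fit.ml)
```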
Thread closed. Please continue over there.
Dif-tor heh smusma 🖖🏼 Long live Ukraine!
Helmut Schütz
The quality of responses received is directly proportional to the quality of the question asked. 🚮
