ElMaestro
★★★

Denmark,
2019-01-30 12:07

(edited by ElMaestro on 2019-01-30 13:56)
Posting: # 19826

 LSMeans, still causing me headaches [General Statistics]

Hi all,

I am getting back to a good old topic, the LS Mean. I am asking this in very general terms, without specific reference to the typical BE model, and I really really hope we can get beyond "if there's balance then so-and-so...".


I still have pretty much no idea what an LSMean really is. I am aware that this SAS invention is described e.g. here and its calculation here.

But I am still quite unable to see what an LS Mean achieves. For example, "LS-means are predicted population margins; that is, they estimate the marginal means over a balanced population." ... what does that mean, and why does it make the LS Mean more relevant than a basic model effect from the b vector of y = Xb + e?
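To make my confusion concrete, here is how I understand the mechanics (a minimal numpy sketch; the data, group labels and numbers are purely invented for illustration):

import numpy as np

# invented unbalanced data: treatment T sits mostly in group B, R mostly in group A
trt = np.array(["T", "T", "T", "T", "T", "R", "R", "R", "R", "R"])
grp = np.array(["A", "B", "B", "B", "B", "A", "A", "A", "A", "B"])
y   = np.array([10.2, 12.1, 11.8, 12.4, 12.0, 10.0, 9.7, 10.3, 9.9, 11.9])

# reference-cell design matrix: intercept, trt == T, grp == B
X = np.column_stack([np.ones(len(y)), (trt == "T").astype(float), (grp == "B").astype(float)])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

raw_mean_T = y[trt == "T"].mean()   # plain arithmetic mean of the T observations

# LS mean of T: the model prediction for T averaged over a *balanced* grid of the
# other factor, i.e. grp = A and grp = B each weighted 1/2; that is L @ b with L = [1, 1, 0.5]
L = np.array([1.0, 1.0, 0.5])
lsmean_T = L @ b

print(raw_mean_T, lsmean_T)         # they differ because T is over-represented in group B

So, as far as I can tell, the LS Mean is nothing but a particular linear combination L·b of the fitted coefficients; my question is why that particular L is the scientifically interesting one.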

Can someone with stats knowledge tell me:
  1. Generally, in which situations, in scientific (not regulatory) terms, are LSMeans relevant?
  2. Why (in which situations generally) is a least squares treatment effect (i.e. the coefficients in b) less relevant than an LSMean when the two may differ?
  3. On what basis would we say that the model residual "always" can be used to generate a CI for LS Mean differences?
In other words, can you convince me using arithmetolophystics and Al Jabra that the standard error of a treatment effect is the same as the standard error for an LSMean (for the given level of the factor of interest).
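For reference, my own (possibly naive) reading of the algebra is this: if an LSMean is just a linear combination L·b of the fitted coefficients, then under the usual OLS assumptions

b = (X'X)^-1 X'y,   Var(b) = σ² (X'X)^-1

and hence

Var(L·b) = σ² L (X'X)^-1 L',   estimated by plugging in the residual mean square for σ².

The same would hold for a difference of LSMeans (take L as the difference of the two defining rows). What I would like to understand is why this σ², the model residual, is "always" the right quantity to use there.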

In perspective, and pardon this little provocation: Kinetica also calculated certain things on the basis of an assumption of balance whether or not there truly was balance. And that was... well... possibly not considered widely relevant or optimal. :-D

Please: we do not need to discuss balance and when LSMeans are equal to ordinary means or treatment effects in the b vector for a BE trial. While the two are tightly related, this latter aspect is not per se what I am asking about.


Edit: Category changed; see also this post #1. [Mittyri]

Pass or fail!
ElMaestro
Obinoscopy
★    

USA,
2019-02-03 16:35

@ ElMaestro
Posting: # 19850

 LSMeans, still causing lots of people headaches

Hi ElMaestro,

I guess LSMeans still give a lot of people headaches, which explains why your thread has been avoided like the plague.

Well, I am not an expert, but lemme try and say what I know... my widow's mite. I am responding to questions (1) and (3).

I think the LSMean is a more accurate estimate of the mean of Ln[T] and Ln[R] in crossover studies. This is because the other mean estimates do not take cognizance of the fact that the var(Ln[T]) in sequence TR is different from the var(Ln[T]) in sequence RT. Same for Ln[R]. In effect, the two populations (sequences TR and RT) are different.

I usually imagine a situation where a drug has a lower AUC in period II than in period I (perhaps due to antibodies that were developed during period I). In such a case, it would not be advisable to find the means of Ln[T] and Ln[R] without accounting for the period effect. If this is not done, the mean Ln[T] might be underestimated when compared to the mean Ln[R] if the study is not balanced (RT having more subjects than TR).
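A toy numerical example of what I mean (the numbers are completely made up, and I am using numpy just to keep myself honest): suppose the true treatment effect is zero, but period II lowers every log response by 0.5, and sequence RT has more subjects than TR.

import numpy as np

# made-up log responses: no true T-R difference, but period II is 0.5 lower
lnT_TR = np.array([1.0, 1.0])   # sequence TR (2 subjects): T in period I
lnR_TR = np.array([0.5, 0.5])   #                           R in period II
lnR_RT = np.array([1.0] * 6)    # sequence RT (6 subjects): R in period I
lnT_RT = np.array([0.5] * 6)    #                           T in period II

# naive difference of raw means: dominated by sequence RT, so T looks worse than R
naive = np.concatenate([lnT_TR, lnT_RT]).mean() - np.concatenate([lnR_TR, lnR_RT]).mean()

# LSMean-style difference: average T - R within each sequence, then weight the sequences equally
lsm = 0.5 * (lnT_TR.mean() - lnR_TR.mean()) + 0.5 * (lnT_RT.mean() - lnR_RT.mean())

print(naive, lsm)   # naive = -0.25 (pure period-effect bias), lsm = 0.0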

The Model Residual is the variance for the difference of LSMeans between Ln[T] and Ln[R]. Lemme do a little mathematical manipulation (I might be wrong though) to buttress my point:

Diff of LSM = {(Ln[T] in TR) + (Ln[T] in RT)}/2 - {(Ln[R] in TR) + (Ln[R] in RT)}/2
Diff of LSM = 1/2{(Ln[T] in TR) + (Ln[T] in RT) - (Ln[R] in TR) - (Ln[R] in RT)}
Diff of LSM = 1/2{(Ln[T] in TR) - (Ln[R] in TR) + (Ln[T] in RT) - (Ln[R] in RT)}
Diff of LSM = {(Ln[T] in TR) - (Ln[R] in TR)}/2 + {(Ln[T] in RT) - (Ln[R] in RT)}/2
Diff of LSM = Residuals in TR + Residuals in RT

Thus the Mean Sum of Squares for the Residuals is the variance for the difference of LSMeans. This is my arithmetolophystics and Al Jabra. I know there is a lot of magic in it.

For your other questions, I guess the headache remains until someone offers us some Panadol or Tylenol.

Regards

Scopy
ElMaestro
★★★

Denmark,
2019-02-03 19:39

@ Obinoscopy
Posting: # 19851

 LSMeans, still causing lots of people headaches

Hellobi :-D,

and thanks for helping towards an understanding.

I am a bit confused: what is ln[T] in your terminology?
Is it the individual measurements, or effects from the b vector, or is it the average on the ln scale?
And what is the starting point for this?

Also, residuals (once a linear model has been fit to the least squares criterion) sum to zero, right? I got confused when you went from step 4 to 5.


❝ The Model Residual is the variance for the difference of LSMeans between Ln[T] and Ln[R].


Why?
I would think it is the plain uncertainty for a (within-) effect estimate in the b vector when we talk crossovers. Multiply by two and you have the variance of the difference, or something like that.
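If I remember the standard 2×2 crossover arithmetic correctly, for the T − R comparison this works out as

SE(difference of LSMeans) = sqrt( MSE/2 · (1/n1 + 1/n2) )

with n1 and n2 the numbers of subjects in the two sequences and MSE the residual mean square from the ANOVA.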

Pass or fail!
ElMaestro
Obinoscopy
★    

USA,
2019-02-03 20:43

@ ElMaestro
Posting: # 19852

 LSMeans, still causing lots of people headaches

❝ Hellobi :-D,


❝ thanks for helping towards an understanding.


No harm in trying :-D

❝ I am a bit confused: what is ln[T] in your terminology?

❝ Is it the individual measurements, or effects from the b vector, or is it the average on the ln scale?

❝ And what is the starting point for this?


It's the individual measurements.

❝ Also, residuals (once a linear model has been fit to the least squares criterion) sum to zero, right? I got confused when you went from step 4 to 5.


Yeah, that's true, the residuals should sum up to zero. The variance, however, is based on the sum of squares of the residuals, which in this case will be greater than zero. I was just trying to show the relationship between the residuals and the difference of the LSMeans. The maths isn't perfect, not one bit.

❝ ❝ The Model Residual is the variance for the difference of LSMeans between Ln[T] and Ln[R].


❝ Why?

❝ I would think it is the plain uncertainty for a (within-) effect estimate in the b vector when we talk crossovers. Multiply by two and you have the variance of the difference, or something like that.


Oh yes! Multiplying it by two gives the variance of the difference of LSMeans. Hope I have not worsened the headache with my statistically incoherent post :-D.
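To convince myself (and maybe you) numerically, here is a little numpy check with invented data: for a 2×2 crossover fitted by ordinary least squares with fixed subject effects, the variance of the T − R difference of LSMeans, computed as MSE · L (X'X)^-1 L', comes out as (MSE/2)·(1/n1 + 1/n2).

import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 3, 5                      # subjects in sequences TR and RT (unbalanced on purpose)
n = n1 + n2

# one row per observation: (subject, period, treatment)
rows = []
for s in range(n1):
    rows += [(s, 0, 1), (s, 1, 0)] # sequence TR: T in period I, R in period II
for s in range(n1, n):
    rows += [(s, 0, 0), (s, 1, 1)] # sequence RT: R in period I, T in period II
subj = np.array([r[0] for r in rows])
per  = np.array([r[1] for r in rows], float)
trt  = np.array([r[2] for r in rows], float)
y    = 1.0 + 0.1*trt - 0.2*per + rng.normal(0, 0.15, size=2*n)  # invented log responses

# design: intercept, subject dummies (subject 0 as reference), period, treatment
subj_dummies = (subj[:, None] == np.arange(1, n)[None, :]).astype(float)
X = np.column_stack([np.ones(2*n), subj_dummies, per, trt])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
mse = resid @ resid / (2*n - X.shape[1])   # residual mean square

# contrast picking out the treatment coefficient (which equals the T-R LSMean difference here)
L = np.zeros(X.shape[1]); L[-1] = 1.0
var_contrast = mse * (L @ np.linalg.inv(X.T @ X) @ L)
print(var_contrast, mse/2 * (1/n1 + 1/n2))   # the two agree (up to floating point)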

Regards,

Scopy