Helmut
★★★
Vienna, Austria
2014-12-23 16:32
Posting: # 14150

S×F variance [General Statistics]

Dear all – or shall I rather say Angus & John?

Since the other thread (due to the indenting of posts) became increasingly difficult to read, I decided to close it. Please continue here.

Regrettably I had too little spare time to dive into it. However, some points:
  1. #14138: John is correct that the setup should essentially follow FDA’s code for RSABE. Therefore, anything obtained from ABE is not useful.
  2. #14141: I developed the RSABE-workflow (see here) in the first place. We presented a poster at the AAPS Meeting in 2013 (download). Ana’s new templates removed the clumsy workaround for joining values of the χ², since chiinv(p, df) was introduced in PHX/WNL6.4 and can be used in the custom transformation (RSABE|Prepare Data for RSABE analysis|s2wr and dfd|Step 4).
    @John: Note that these templates will not work in PHX/WNL6.3.
  3. #14146: The template is for RSABE; therefore, variability of T is not included (which John correctly pointed out). Yes, it’s doable to add the coding for T.
  4. #14147: The template covers the partial replicate as well. Still to do for the next version: a setup for the fully replicated two-sequence, three-period (TRT|RTR) design, which would avoid the convergence issues sometimes seen in ABE. BTW, I’ll submit a protocol with such a design to FDA’s OGD next month for review.
  5. #14149: Modifying the ABE-part doesn’t do the job (see #1 above). What you could do: Copy the entire Prepare Data for RSABE analysis workflow and change the coding in the second one from R to T. Don’t forget to change all filters and transformations. Then you have both the results for R and T. Proceed from there.

Dif-tor heh smusma 🖖🏼 Довге життя Україна!
Helmut Schütz

The quality of responses received is directly proportional to the quality of the question asked. 🚮
Science Quotes
AngusMcLean
★★
USA
2014-12-23 17:32
@ Helmut
Posting: # 14151

S×F variance

Thanks Helmut:

I understand the passage immediately below very well:

❝ Regrettably I had too little spare time to dive into it. However, some points:

❝ 1. #14138: John is correct that the setup should essentially follow FDA’s code for RSABE. Therefore, anything obtained from ABE is not useful.


But I cannot reconcile it with the passage immediately below, since this pertains to the average BE part of the template, so why do it if you are focusing on RSABE?

❝ Modifying the ABE-part doesn’t do the job (see #1 above). What you could do: Copy the entire Prepare Data for RSABE analysis workflow and change the coding in the second one from R to T. Don’t forget to change all filters and transformations. Then you have both the results for R and T. Proceed from there.


Also I had planned on doing the three chi-squared calculations in NCSS described in Part 4 of the new Methylphenidate Guidance (2014). I am puzzled by how you can do this in Phoenix (see below).

❝ Ana’s new templates removed the clumsy workaround for joining values of the χ², since chiinv(p, df) was introduced in PHX/WNL6.4 and can be used in the custom transformation.


Angus
Helmut
★★★
Vienna, Austria
2014-12-23 18:17
@ AngusMcLean
Posting: # 14152

S×F variance

Hi Angus,

❝ But I cannot reconcile it with the passage immediately below, since this pertains to the average BE part of the template, so why do it if you are focusing on RSABE?


I meant the sub-WF of the RSABE-WF. I stated “Prepare Data for RSABE analysis”, right?

❝ Also I had planned on doing the three chi-squared calculations in NCSS described in Part 4 of the new Methylphenidate Guidance (2014). I am puzzled by how you can do this in Phoenix (see below).


❝ ❝ Ana’s new templates removed the clumsy workaround for joining values of the χ², since chiinv(p, df) was introduced in PHX/WNL6.4 and can be used in the custom transformation.


PHX’s function chiinv(p, df) expects a p-value (here 0.95) and the degrees of freedom (df) from the model. See my previous post (answer #2) where to look in the template how it can be coded.
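For readers without PHX at hand, the same cutoff can be approximated in plain Python via the Wilson–Hilferty transformation. This is a sketch only, not Phoenix’s exact chiinv; the `chiinv` helper below is my own naming, and the approximation is good to roughly two decimals at the degrees of freedom seen in this thread.

```python
from math import sqrt
from statistics import NormalDist

def chiinv(p: float, df: int) -> float:
    """Wilson-Hilferty approximation of the chi-squared quantile,
    mirroring the call signature of PHX's chiinv(p, df)."""
    z = NormalDist().inv_cdf(p)    # standard-normal quantile
    a = 2.0 / (9.0 * df)
    return df * (1.0 - a + z * sqrt(a)) ** 3

# Values discussed later in this thread (exact: 91.67024 and 49.16227):
print(chiinv(0.95, 71))   # close to 91.67
print(chiinv(0.05, 67))   # close to 49.16
```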

AngusMcLean
★★
USA
2014-12-29 00:12
@ Helmut
Posting: # 14190

S×F variance

THANKS HELMUT: I am using the FDA RSABE template with the EMEA simulated data on progesterone. I checked the chi-squared calculation independently in NCSS; it exactly matched the value obtained in Phoenix 6.4.

Following your suggestion I am focusing on using the RSABE workflow rather than average BE. I copied the workflow object "Prepare......" within the RSABE worksheet and used this copy to attempt to include Wt instead of Wr. That was my goal. It did run and I got a value, so here are my values from Phoenix 6.4, with John's values from SAS shown in parentheses.

s²Wr = 0.19931 (0.19931); s²Wt = 0.11864 (0.11654); s²Wi = 0.16590 (0.16590).

We note the values are the same except for the one where I introduced the code for Wt. I am thinking there is a discrepancy in my coding, so I will look at it again.
AngusMcLean
★★
USA
2014-12-23 21:56
@ Helmut
Posting: # 14153

S×F variance

Thanks Helmut: I will look at the coding we have at the moment in the RSABE sub-workflow "preparing data sets for analysis", with a view to understanding the steps in the filters and transformations prior to altering R to T in the copy of the sub-workflow. Then I might be successful.
jag009
★★★
NJ
2014-12-24 00:07
@ Helmut
Posting: # 14154

S×F variance

Helmut, back from Swashbuckling? :-)

❝ John: Note that these templates will not work in PHX/WNL6.3.


I think it works, but I get a warning of some sort at the beginning. I didn't have time to play around with it since PHX is a secondary tool for me in terms of stats (I use PHX to get my PK parameters).

I can't imagine it: 3 partial AUCs + AUCinf + Cmax = 4 parameters, then add in 4 × subject-by-formulation variance tests. Freaking 8 evaluations to conclude BE. OVERKILL!!!!

John
AngusMcLean
★★
USA
2014-12-24 02:39
@ jag009
Posting: # 14155

S×F variance

THANKS JOHN: Allow me to point out to you that, from Barbara Davit's article, if the regular BE test applies to any of the metrics (swR < 0.294) then you do not use the reference-scaled approach. You simply use the average BE test and discontinue the interest in RSABE. So does that mean the upper confidence bound of σ²D does not apply in cases where average BE is used for a metric? Yes?

The metric most at risk with Concerta is the latest one (I am thinking).

Angus
jag009
★★★
NJ
2014-12-24 17:21
@ AngusMcLean
Posting: # 14165

S×F variance

Hi Angus,

Of course the Concerta BE studies will not be RSABE-based, since the swR values are all less than 0.294. So for partial AUCs + Cmax we will go through the ABE routine and conclude based on the 80–125% CI. Then Helmut's suggestion above can be used to evaluate the 2nd criterion. The 2nd criterion (it seems to me) is IBE/PBE-based, and I have no clue whether you can get the info from the ABE routine. Personally I just don't see how one can fail the 2nd criterion (in my thinking), but then you have to demonstrate both... The 95% UCB, I believe, is for FDA to collect data; they probably want it for their own use.

Hope I answered your question.

P.S. I still don't understand the email you got from Ana regarding the computation of S2D from the G Matrix. If I recall correctly, she said to use the between subject variances of T & R + the between subject covariance for T & R???? See post :confused:

Happy Holidays!

John
AngusMcLean
★★
USA
2014-12-24 18:50
@ jag009
Posting: # 14167

S×F variance

Hello John: Best Wishes for the festive season.

❝ Of course the Concerta BE studies will not be RSABE based since the SwRs are all less than 0.294.


What makes you say that? Don't you think that, like Ambien, the first one is > 0.294?

❝ So for partial AUCs + Cmax we will go through the ABE routine and conclude based on 80-125% CI. Then Helmut's suggestion above can be used to evaluate the 2nd criteria.


Could you define the 2nd criterion? Do you mean σ²D and the upper confidence bound, i.e., steps 3 and 4 in the guidance?

I am saying: do we still have to evaluate the new criterion if we have shown that RSABE does not apply? That is one question I have.

❝ The 2nd criteria (seems to me) is IBE/PBE based and I have no clue if you can get the info from the ABE routine. Personally I just don't see how one can fail the 2nd criteria (in my thinking)


❝ … but then you have to demonstrate both... The 95% UCB I believe is for FDA to collect data, they probably want it for their own use.

I agree. I was over there recently (30 minutes from here) and we got questions like "What are your thoughts...?" They want to use your thoughts and your data to assist them with the review process.

❝ Hope I answered your question.


Please see above and clarify.

❝ P.S. I still don't understand the email you got from Ana regarding the computation of S2D from the G Matrix. If I recall correctly, she said to use the between subject variances of T & R + the between subject covariance for T & R???? See post :confused:


Your recollection of the email is correct; she is telling you to use the G-matrix components, and I spelled out the calculation. I think you do understand it, but you are thinking that it is not correct for application to the RSABE worksheet. Ana is not a statistical programmer. I am not sure how to follow up on that. There is a problem here.


The only thing I can think of is that in the case of a replicate study design, when swR < 0.294, reference scaling is not applied. You use average BE, and at that point you use Ana's formula from her email to calculate σ²D; you do not use the RSABE workflow to calculate σ²D.

What do you think of that?


Please don’t open new posts all the time. You can edit (i.e., also add sumfink) within 24 hours. THX. [Helmut]
AngusMcLean
★★
USA
2015-01-06 18:30
@ jag009
Posting: # 14226

S×F variance

John: These are the values from Phoenix.

Par  dfd  Var (σ²)  Cinv
WT   67   0.1186    87.1
WR   71   0.1993    91.67
WI   67   0.1659    49.16

The one that is different from your value from SAS is WT. I modified the code in the RSABE workflow to get WT.

You provided your values to me in an earlier note; also in an earlier message you provided me with your σ²D value. I verified the calculation using your values of the components (see immediately below).
σ²D = σ²I − 0.5·(σ²wt + σ²wr) = 0.00798


However I was not able to verify your upper confidence boundary value using your values (see below for my check).

Hσ²D = ΣEq + (ΣUq)^1/2 = 0.06695

Can you check your data? I have an elegant spreadsheet here that I can send to you so you can see where my values come from.


Angus
jag009
★★★
NJ
2015-01-08 23:42
@ AngusMcLean
Posting: # 14263

S×F variance: Followup

Hi Angus,

Finally had some time to do this quickly. I apologize, as I made a typo in my SAS code for the 95% UCB computation. My results are similar to yours.


Par dfd    Var(σ2)        Cinv
WT   69  0.116539674     89.39120787
WR   71  0.199313551     91.67023918
WI   67  0.165897781     49.16227018


So, the only differences between our results are with WT. My dfd is 69 while yours is 67, plus the slight difference in Var(σ²). If you count the number of subjects who have both T1 & T2 data, there are 69. I think the number of subjects used in the computation for test is the culprit (in Phoenix). For R, there are 71 subjects who completed both R1 and R2.
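The counting rule can be made explicit. A toy sketch follows (hypothetical subject records, not the actual progesterone data set): a subject enters the within-treatment variance computation only with both replicates of that treatment, regardless of any missing period of the other treatment.

```python
# Hypothetical example records: subject -> periods actually observed.
records = {
    "subj01": ["T1", "T2", "R1", "R2"],   # complete
    "subj02": ["T1", "T2", "R1"],         # missing R2 -> still usable for T
    "subj03": ["T1", "R1", "R2"],         # missing T2 -> dropped for T
}

def subjects_complete_for(trt: str, data: dict) -> list:
    """Subjects with BOTH replicate observations of treatment trt."""
    need = {trt + "1", trt + "2"}
    return [s for s, periods in data.items() if need <= set(periods)]

n_t = len(subjects_complete_for("T", records))
df_t = n_t - 2          # n - s, with s = 2 sequences in this design
```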

For me, σ2D = 0.0079711687

Using the above,

Hσ²D = ΣEq + (ΣUq)^1/2 = 0.073582621
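The σ²D point estimate is plain arithmetic on the three variance components; a quick sanity check in Python, using the values quoted in this thread:

```python
# Variance components reported in this thread (SAS output)
s2wi = 0.165897781   # variance of the T-R contrasts (ilat)
s2wt = 0.116539674   # within-subject variance of T (dlat, T)
s2wr = 0.199313551   # within-subject variance of R (dlat, R)

sigma2_D = s2wi - 0.5 * (s2wt + s2wr)
print(sigma2_D)      # about 0.0079712
```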

I haven't done this exercise in Phoenix, though, so I don't know why it didn't end up with n = 69 for Test at your end.

Thanks
John
AngusMcLean
★★
USA
2015-01-09 02:44
@ jag009
Posting: # 14264

S×F variance: Followup

John: Thanks for the update. I did not count missing subjects. The number of subjects who completed both R1 and R2 is 73 for the full replicate data set.

I checked it three times. I did not use the subject numbers in the rows. I used the row numbers (this takes into account missing subject numbers). The subjects are numbered 1 to 78, but there are only 73 rows of subjects in the study.

The number of subjects who completed T1 and T2 is 69. (Four subjects missed a treatment with T2.)

So the number of degrees of freedom for R and T is 71 and 67, respectively.

That is what I have. Please can you check?

Angus
jag009
★★★
NJ
2015-01-12 17:53
@ AngusMcLean
Posting: # 14271

S×F variance: Followup

Hi Angus,

❝ That is what I have. Please can you check?


I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference) which show the # of subjects used for each computation. You can use this to check and see if Winnonlin used the same subjects per computation?

Dlatt(for computation of Wt)
subject 1,2,3,4,5,6,7,8,9,10,12,13,14,15,16,17,18,19,21,22,23,24,25,26,27,28,29,30,31
32,33,34,35,36,37,38,39,40,41,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60
62,63,64,65,66,68,70,72,73,74,75,76,77,78

Dlatr(for computation of Wr)
subject 1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,25,26,27,28,29,30
32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60
62,63,64,65,66,68,69,70,72,73,74,75,76,77,78

Ilat(for computation of Wi)
subject 1,2,3,4,5,6,7,8,9,10,12,13,14,15,16,17,18,19,21,22,23,25,26,27,28,29,30,32,33
34,35,36,37,38,39,40,41,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,62,63,64,65
66,68,70,72,73,74,75,76,77,78


John
AngusMcLean
★★
USA
2015-01-12 21:56
@ jag009
Posting: # 14272

S×F variance: Followup

❝ ❝ That is what I have. Please can you check?


❝ I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference) which show the # of subjects used for each computation. You can use this to check and see if Winnonlin used the same subjects per computation?


No, it did not. Phoenix has 69 pairs used for data analysis and 9 subjects missing (11, 20, 61, 67, 69, 71, 42, 31 and 24). My above message is in error. In SAS you have 24 and 31 retained compared with Phoenix, so you have 71 subjects in the calculation.


Helmut
★★★
Vienna, Austria
2015-01-13 02:25
@ AngusMcLean
Posting: # 14273

S×F variance: 3rd opinion

Hi Angus & John,

Seems that you guys are having some fun!

@Angus: I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df. I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong?
           dfd  Var(σ²)      Cinv
dlat(T)     69  0.116539674  89.39120787
dlat(R)     71  0.199313551  91.67023918
ilat(T–R)   67  0.165897781  87.10807220


Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in intermediate dlat is indeed Sequence (and not empty!). Sometimes PHX “forgets” the model specification during copy/pasting.

AngusMcLean
★★
USA
2015-01-13 21:13
@ Helmut
Posting: # 14286

S×F variance: 3rd opinion

Many Thanks Helmut: I will certainly check out the data and your suggestion.

Right now I am working "under the gun" and I cannot get to it today, but I will.

The chances are it is the new step adding the Test variance that is errant.

Angus
AngusMcLean
★★
USA
2015-01-14 22:29
@ Helmut
Posting: # 14290

S×F variance: 3rd opinion

❝ I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df. I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong?


Helmut: 49.16 is what I have; as you say, it is the p = 0.05 value one uses for the chi-squared calculation for MI.

❝ Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in intermediate dlat is indeed Sequence (and not empty!). Sometimes PHX “forgets” the model specification during copy/pasting.


On the other hand, the sequence is still specified in my copy of "Prepare data sets for analysis..." So it seems that the code needs to be altered from what I have.

ANGUS
jag009
★★★
NJ
2015-01-15 00:00
@ AngusMcLean
Posting: # 14291

S×F variance: 3rd opinion

You guys have fun. That's why I like SAS more. I have more control (but most of the time less control!!) :-D

John
Helmut
★★★
Vienna, Austria
2015-01-15 01:54
@ jag009
Posting: # 14292

α vs. 1–α

Hi John,

❝ You guys have fun. That's why I like SAS more. I have more control (but most of the time less control!!) :-D


I think all these systems (SAS, PHX, R) can be nasty beasts. They love to be treated carefully and hate quick-shots. What I missed:

[image]

For H2/H3 1–α (like in RSABE), but for H1 α… Dammit!

What I have so far:

σ²D 0.007971169

     Eq               Hq              Uq
E1   0.165897781  H1  0.232840079  U1 0.004481271
E2  -0.058269837  H2 -0.046281491  U2 0.000143720
E3  -0.099656775  H3 -0.072837204  U3 0.000719289
∑Eq  0.007971169                  ∑Uq 0.005344281

Hσ²D = ∑Eq + √∑Uq = 0.007971169 + √0.005344281 = 0.081075759


My 0.081075759 ≠ your 0.073582621. What a mess!

jag009
★★★
NJ
2015-01-15 23:38
@ Helmut
Posting: # 14295

α vs. 1–α

Hi Helmut,

For each of the chi square values, I used in SAS:

For i, cinv(0.05,df_it);
For t, cinv(1-0.05,dfdt);
for r, cinv(1-0.05,dfdr);

Correct?

Since,
χ²(α, n−s)
χ²(1−α, n−s)
χ²(1−α, n−s)


John
Helmut
★★★
Vienna, Austria
2015-01-16 01:13
@ jag009
Posting: # 14296

α vs. 1–α

Hi John,

❝ For each of the chi square values, I used in SAS: […]



According to what’s stated in the guidance, correct. Which does not imply that I understand why to use 0.95 for T & R and 0.05 for T-R. :confused:

Did you check the other results?

jag009
★★★
NJ
2015-01-19 21:09
@ Helmut
Posting: # 14304

α vs. 1–α

❝ According to what’s stated in the guidance, correct. Which does not imply that I understand why to use 0.95 for T & R and 0.05 for T-R. :confused:


Probably because the value is derived from T−R (and is therefore subject to a different alpha?) rather than from one entity (T or R)?

I have no clue. Maybe I will dig around after finishing my convolution/deconvolution work at the office...

John
jag009
★★★
NJ
2015-01-19 17:02
@ Helmut
Posting: # 14303

α vs. 1–α

Hi Helmut,

Yours:

    Eq               Hq              Uq
E1  0.165897781  H1  0.232840079  U1 0.004481271
E2 -0.058269837  H2 -0.046281491  U2 0.000143720
E3 -0.099656775  H3 -0.072837204  U3 0.000719289


Mine from SAS:
       Eq               Hq              Uq
E1  0.1658977807 H1 0.2260910911  U1 0.0036232346
E2 -0.058269837  H2 -0.044977787  U2 0.0001766786
E3 -0.099656775  H3 -0.077185694  U3 0.0005049495


Looks like we are different for H and U values and the culprit lies within H.

Here are my equations for H's:

H_1=(df_it*s2wi)/chi_wi;
H_2=(-0.5*dfdt*s2wt)/chi_wt;
H_3=(-0.5*dfdr*s2wr)/chi_wr;


Please see my previous post on the χ² statements for Wi, Wt and Wr. The n−s is different per equation (right?). df_it = DF for i, dfdt = DF for t, dfdr = DF for r. Maybe I am wrong (or you?)?
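Plugging the agreed variances, dfs and χ² cutoffs into these equations, with Uq = (Hq − Eq)² as in the guidance's linearization, reproduces John's table end to end; a sketch in Python (constants taken from this thread):

```python
from math import sqrt

# Agreed inputs: (variance, df, chi-squared cutoff)
s2wi, df_i, chi_i = 0.165897781, 67, 49.16227018   # cinv(0.05, 67)
s2wt, df_t, chi_t = 0.116539674, 69, 89.39120787   # cinv(0.95, 69)
s2wr, df_r, chi_r = 0.199313551, 71, 91.67023918   # cinv(0.95, 71)

E = [s2wi, -0.5 * s2wt, -0.5 * s2wr]               # point estimates
H = [df_i * s2wi / chi_i,                          # H1 (no 0.5 factor!)
     -0.5 * df_t * s2wt / chi_t,                   # H2
     -0.5 * df_r * s2wr / chi_r]                   # H3
U = [(h - e) ** 2 for h, e in zip(H, E)]           # squared distances

sigma2_D = sum(E)                                  # about 0.0079712
ucb95 = sigma2_D + sqrt(sum(U))                    # about 0.0735826
```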

John
AngusMcLean
★★
USA
2015-01-30 00:38
@ Helmut
Posting: # 14330

S×F variance: 3rd opinion

❝ @Angus: I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df. I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong?

          dfd  Var_wt       Cinv

❝ dlat(T)    69  0.116539674  89.39120787

❝ dlat(R)    71  0.199313551  91.67023918

❝ ilat(T–R)  67  0.165897781  87.10807220


❝ Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in intermediate dlat is indeed Sequence (and not empty!). Sometimes PHX “forgets” the model specification during copy/pasting.


I am able to resume this work. John sent me the list of subjects he used in SAS (71 in total). I looked at my list and it has 69 subjects. The extra subjects he has over me are numbers 24 and 31.

I looked at the object within the "Copy of Prepare data sets for analysis….." entitled Dij complete rows Data Wizard. This is the point where incomplete data is excluded by the transformation. I see the commands for the transformation and reproduce them below.

Exclude where [LOG DATA R1] is NULL entire ROW
Exclude where [LOG DATA R2] is NULL entire ROW


For the transformation I changed over to LOG DATA T1 for the X column and LOG DATA T2 for the Y column.

John sent me his data set for the Wt calculation and I compared it with the subjects in mine.

This transformation excludes subjects 24 and 31, which are retained in John's data set, so I get 69 instead of 71 subjects. Is it the correct exclusion criterion to apply?

The fixed effect is set to sequence in intermediate file mentioned above.

Angus
AngusMcLean
★★
USA
2015-01-31 00:24
@ jag009
Posting: # 14333

S×F variance: Followup

❝ I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference) which show the # of subjects used for each computation. You can use this to check and see if Winnonlin used the same subjects per computation?


Dlatt(for computation of Wt)

subject 1,2,3,4,5,6,7,8,9,10,12,13,14,15,16,17,18,19,21,22,23,24,25,26,27,28,29,30,31

32,33,34,35,36,37,38,39,40,41,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60

62,63,64,65,66,68,70,72,73,74,75,76,77,78


John: You have 71 subjects (69 df) for Wt; I have 69 subjects (67 df). The two subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises: what are the rules in your code that allow you to include them in SAS?

Excluding them in Phoenix follows the code in Phoenix.

Angus
jag009
★★★
NJ
2015-02-02 17:31
@ AngusMcLean
Posting: # 14354

S×F variance: Followup

Angus,

❝ John: You have 71 subjects (69 df) for Wt: I have 69 subjects (67 df) ......the 2 subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises what are the rules in your code that allow you to include them in SAS.


Before we proceed: are you in agreement with my subject listing for Dlatr and ilat? I looked at the Dlatt (Wt) listing and the SAS dataset and noted the following:

Subject 24 only has TTR (missing one R), and subject 31 has RTT (missing one R). Why would Winnonlin drop these two subjects? Both have 2 Ts and hence Dlatt should've been calculated just like the rest of the qualified subjects.

John

P.S. I am still waiting for Helmut's response on the H and U calculations...
AngusMcLean
★★
USA
2015-02-02 18:22
(edited by AngusMcLean on 2015-02-02 22:16)
@ jag009
Posting: # 14355

S×F variance: Followup

❝ ❝ John: You have 71 subjects (69 df) for Wt: I have 69 subjects (67 df) ......the 2 subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises what are the rules in your code that allow you to include them in SAS.


❝ Before we proceed. Are you in agreement with my subject listing for Dlatr and ilat? I looked at Dlatt (wt) listing and the SAS dataset and noted the following:


❝ Subject 24 only has TTR (missing one R), and subject 31 has RTT (missing one R). Why would Winnonlin drop these two subjects? Both have 2 Ts and hence Dlatt should've been calculated just like the rest of the qualified subjects.


❝ John


❝ P.S. I am still waiting for Helmut's response the H and u calculations...


John: Thank you. The only difference was in the WT result. What helped me out was that you gave me the listing you used for WT. Because you used subjects 24 and 31, that gave me the clue as to how to modify the Phoenix workflow to get the same (and correct) result, agreeing with your and Helmut's results. The subjects with replicate values available for T (e.g. 24, 31) are now included, since I modified the transformation criteria to accept subjects with replicate T values. Also the calculation is now for test ratios, not reference. I did not bother using a copy of "Prepare Data Sets for Analysis ..."; I modified the standard one that comes with the worksheet and got almost identical results (WT 0.1165394 and Chi 89.391268, agreeing with Helmut). I have a spreadsheet in Excel that does the next steps (my results below). I rounded my values to 4 decimals (I should have used them all).

Hσ²D = ΣEq + (ΣUq)^1/2 = 0.06696. I can send you the spreadsheet if you like: it is set up like the guidance, and you can check and confirm the steps sequentially. Alternatively, send me the values you have for the variance parameters (with all decimals) and I will calculate the final steps in the spreadsheet sequentially. If I assume your prior values are still current:
Par dfd    Var(σ2)        Cinv
WT   69  0.116539674     89.39120787
WR   71  0.199313551     91.67023918
WI   67  0.165897781     49.16227018
Then I get:
σ²D = σ²I − 0.5·(σ²wt + σ²wr) = 0.00798
and
Hσ²D = ΣEq + (ΣUq)^1/2 = 0.06692

Note: my value for H1 (MI) is 0.11305 and uses the χ² value 49.16227018 (p = 0.05); it is smaller than your value.

Angus
jag009
★★★
NJ
2015-02-05 19:14
(edited by jag009 on 2015-02-05 21:10)
@ AngusMcLean
Posting: # 14381

S×F variance: Followup

Thanks Angus, you finally crossed the finishing line! :-D

Here are my #s again:

Mine from SAS:

       Eq               Hq              Uq
E1  0.1658977807 H1 0.2260910911  U1 0.0036232346
E2 -0.058269837  H2 -0.044977787  U2 0.0001766786
E3 -0.099656775  H3 -0.077185694  U3 0.0005049495


My Chi square values were:

Wi             Wt             Wr
49.162270179   89.391207873   91.670239176


Can you present all your values (the ones above)?
Your value for Hσ²D = ΣEq + (ΣUq)^1/2 = 0.06692 is still different from mine → 0.0735826211.

My Chi Square equations:

For i, cinv(0.05,df_it);   χ²(α, n−s)
For t, cinv(1-0.05,dfdt);  χ²(1−α, n−s)
for r, cinv(1-0.05,dfdr);  χ²(1−α, n−s)

What values do you use for n-s?

Our σ²D values are more or less the same; mine was 0.0079711687.

Thanks

John.

P.S. Can you send me your spreadsheet? Interested in seeing what yours looks like. You know how to email me, right?
AngusMcLean
★★
USA
2015-02-06 02:39
@ jag009
Posting: # 14382

S×F variance: Followup

❝ P.S. Can you send me your spreadsheet? Interested in seeing how yours look like. You know how to email me right?


I will send you the spreadsheet tomorrow; the structure matches the guidance document. Please check the H1 value in your work and my spreadsheet: that is the difference.


Edit: Full quote removed. Please delete everything from the text of the original poster which is not necessary in understanding your answer; see also this post! [Helmut]
jag009
★★★
NJ
2015-02-06 18:13
@ AngusMcLean
Posting: # 14383

S×F variance: Followup

Hi Angus,

Something is wrong with your computation of H1.

We agreed on MI = 0.165897781, dfd = 67, χ²(0.05, 67) = 49.16227018.

From your previous post:

❝ Note: my value for H1 (MI) is 0.11305 and uses CHi value of 49.16227018; it is smaller than your value since CHI (p=0.05) IS 49.16227018


From FDA’s equation: H1 = (n−s)·MI / χ²(α, n−s)

How did you end up with H1=0.11305? I reversed the equation using your H1 value to solve for (n-s) and ended up with (n-s)=33.5.

Looks like your H1 equation is wrong (you used the H2 and H3 form). FDA's H2 and H3 equations are the same except that H2 uses MT and H3 uses MR:

H# = −0.5·(n−s)·M# / χ²(1−α, n−s), where # = T or R

If you use the correct H1 equation then H1 = 0.226091092, and the final Hσ²D equals my value of 0.073583.
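The arithmetic bears this out; a quick check in Python (values from this thread; the spurious 0.5 factor is the one later confirmed in the spreadsheet):

```python
mi, df_i, chi_i = 0.165897781, 67, 49.16227018

h1_correct = df_i * mi / chi_i          # FDA's H1: no 0.5 factor
h1_with_bug = 0.5 * df_i * mi / chi_i   # spurious 0.5 factor

print(round(h1_correct, 6))    # about 0.226091 (John's value)
print(round(h1_with_bug, 5))   # about 0.11305  (the reported value)
```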
AngusMcLean
★★
USA
2015-02-06 20:36
@ jag009
Posting: # 14384

S×F variance: Followup

Thanks John: you are correct! The answer to your question is that I used a factor of 0.5 in that equation {H1}, and it does not belong there, as is evident in the source FDA guidance. It is the other two parameters that are modified by the 0.5 multiplier.

Now your data checks out on my spreadsheet, since, as you say, the final value is 0.07358.

The Phoenix calculation of WT was done on an ad hoc basis by modifying the existing code (WR). What should be done is that workflow(s) should be created such that both WT and WR can be obtained from the same workflow or workflows. That way you do not need to modify the WR code each time you need WT as well as WR. After all, it is very relevant to have a comparison of WT and WR when comparing two formulations.

Therefore my spreadsheet now validates your calculations, and I have corrected it.


Edit: Full quote removed. Please delete everything from the text of the original poster which is not necessary in understanding your answer; see also this post! [Helmut]
jag009
★★★
NJ
2015-02-17 23:49
@ AngusMcLean
Posting: # 14450

S×F variance: Followup

Angus,

Just curious: do you think we got the n−s term correctly identified? Is it different for MI, MT and MR, or should there be only one (n−s)?

Thanks
John
AngusMcLean
★★
USA
2015-02-20 00:25
@ jag009
Posting: # 14465

S×F variance: Followup

John: I have them different, as you see below from my Excel spreadsheet. Surely they are different, since the n number changes with each parameter?

How about the Prof. in Toronto? What did he say about that question?

Paramater   dfd    Var (σ2)       Cinv
WT         69.00  0.11653967  89.39120787
WR         71.00  0.19931355  91.67023918
WI         67.00  0.16589778  49.16227018
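As a rough cross-check of the Cinv column (cf. Phoenix's chiinv(p, df) custom transformation), the χ² quantiles can be approximated with the Wilson–Hilferty transformation. This is my own sketch, accurate only to about two decimals at these degrees of freedom; use a proper inverse CDF in real work:

```python
# Hedged sketch: Wilson-Hilferty approximation to the chi-square p-quantile,
# as a rough cross-check of the Cinv column above (cf. Phoenix's chiinv(p, df)).
# Accurate to ~2 decimals at these df; use a real inverse CDF in production.

Z95 = 1.6448536269514722  # standard-normal 95% quantile

def chi2_quantile_wh(p, df):
    """Approximate chi-square quantile; only p = 0.95 or 0.05 is needed here."""
    z = Z95 if p == 0.95 else -Z95
    a = 2.0 / (9.0 * df)
    return df * (1.0 - a + z * a ** 0.5) ** 3

# WT and WR use the upper 95% quantile, WI the lower 5% quantile:
for p, df, cinv in [(0.95, 69, 89.39120787),
                    (0.95, 71, 91.67023918),
                    (0.05, 67, 49.16227018)]:
    print(df, round(chi2_quantile_wh(p, df), 2), cinv)  # agrees to ~0.01
```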


Angus
jag009
★★★

NJ,
2015-02-21 22:08
(3322 d 20:52 ago)

@ AngusMcLean
Posting: # 14487
Views: 36,660
 

 S×F vari­­ance: Follow­up

Hi Angus,

I won't be seeing him until March. I will definitely ask him about this.
He took a look at the Proc Mixed G-matrix route to get the within-subject T, R and I, but he didn't comment in detail. He did say the #s can be obtained via that route.

I just don't like the wording FDA used in the draft guidance. It reads like an open-ended statement: "Q2D has an allowance of 0.03". They want us to present both Q2D and the 95% UCB, but the closing statement seems to indicate that they want to see Q2D no more than 0.03, rather than both Q2D and the 95% UCB no more than 0.03. They need some sort of English 101 training???
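The two readings actually diverge with the numbers from this thread (the Q2D point estimate implied by Angus's table and the HQ2D of 0.073583): the lenient reading passes the 0.03 allowance, the strict one fails. A quick sketch; the rule names are mine, not FDA wording:

```python
# Hedged sketch of the two possible readings of the 0.03 "allowance"; the
# rule names and function are mine, not FDA wording. Numbers are those
# discussed in this thread (Q2D from Angus's table, HQ2D from John's result).
ALLOWANCE = 0.03

def passes(q2d, ucb, strict=False):
    """strict=True: both Q2D and its 95% UCB must be <= 0.03;
    strict=False: only the point estimate Q2D must be <= 0.03."""
    if strict:
        return q2d <= ALLOWANCE and ucb <= ALLOWANCE
    return q2d <= ALLOWANCE

q2d = 0.16589778 - 0.5 * (0.11653967 + 0.19931355)  # point estimate, ~0.008
ucb = 0.073583                                      # 95% upper bound (HQ2D)
print(passes(q2d, ucb), passes(q2d, ucb, strict=True))  # True False
```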

John
AngusMcLean
★★  

USA,
2015-02-22 18:25
(3322 d 00:35 ago)

(edited by AngusMcLean on 2015-02-22 20:27)
@ jag009
Posting: # 14488
Views: 36,518
 

 S×F vari­­ance: Follow­up

John: I am not clear on the G matrix. At an earlier date I thought it was my salvation, and I do not understand why it is not to be used. I do know that when I used it in the way I thought was appropriate it gave a ridiculous result. But then it is likely that I am not using it in an appropriate way.

Both Linda and Ana are aware of where we are but have not commented on how to use the G matrix. I have emailed them; Linda has expressed an interest in the work you did with Helmut and in this thread.

Maybe the Prof. in Toronto can shed some light on this confusion. One thing I did do was create a set of replicate ABAB MP data based on actual data I have and run it in Phoenix. I have found it is difficult to fake suitable data. It did run OK for bioequivalence, but as yet I have not pursued steps 2, 3 and 4. The ad hoc changes I make to the existing template to give WT could be used. I need to figure out a way of turning the template into a standard template that produces both the WT and WR variances. It is difficult to do this if you did not write the code in the first place.

Regarding the wording of the recent MP Guidance: what I see is a Guidance that many people have reviewed and edited sentence by sentence. It is confusing in the part you focus on. I think they are not really sure what position to take; they want sponsors to produce data for them, so that once they have enough data they can formulate a rational policy. As you have said yourself, I do not see them failing a submission on that part of the Guidance.


Angus
jag009
★★★

NJ,
2015-02-23 17:58
(3321 d 01:03 ago)

@ AngusMcLean
Posting: # 14489
Views: 36,451
 

 S×F vari­­ance: Follow­up

Angus,

That is my concern, because it seems (I have to revisit to confirm) that the WI obtained from the G-matrix is different from FDA's.

❝ Regarding the wording of the recent MP Guidance then what I see is the Guidance has many people reviewing and editing it sentence by sentence. It is confusing in the part you focus on. I think they are not real sure what position to take and they want to get the sponsors to produce data for them, so that after they have enough data then they can formulate a rational policy. As you have said yourself I do not see them failing a submission on that part of the Guidance.


So sponsors will be guinea pigs! Big guinea pigs!

John
AngusMcLean
★★  

USA,
2015-01-06 19:48
(3368 d 23:13 ago)

@ Helmut
Posting: # 14228
Views: 39,900
 

 S×F vari­­ance

Helmut: my interest in Phoenix was to get the within-subject test variance (WT instead of WR). Following your suggestion I copied and pasted "Prepare data sets for analysis" to create a second one.

Therefore I changed the code over in the second one from R2 and R1 to T2 and T1, as shown in the photo below. Subsequently there were only small changes required (just to the text) to get WT.

[image]

It seemed to work well. I note that the value obtained (0.1186) was slightly higher than John's value (0.11654). Of course the WR information is retained in the original "Prepare Data sets for analysis".

I cannot see anything wrong; it seemed to be straightforward, but I am interested in why John got a different value in SAS. I think he experienced a problem doing this in Phoenix. I hope he sees this.
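For reference, the quantity these workflows compute can be sketched outside Phoenix: per subject, take the difference between the two administrations of the same formulation, fit sequence as the only effect, and halve the residual mean square. This mirrors the dlat → s2wr idea in FDA's code; the data below are made-up, purely illustrative values:

```python
# Hedged sketch of the s2wR / s2wT step (mirroring the dlat -> s2wr idea in
# FDA's RSABE code): per subject, difference of the two log-scale observations
# of the SAME formulation; ANOVA with sequence as the only effect; s2w = MSE/2.
# The data below are made-up, purely illustrative values.

def s2w_from_replicates(records):
    """records: list of (sequence, obs1, obs2) per subject, log scale."""
    diffs = {}  # sequence -> list of per-subject differences
    for seq, x1, x2 in records:
        diffs.setdefault(seq, []).append(x1 - x2)
    n = sum(len(v) for v in diffs.values())
    # residual SS around the per-sequence means
    ss = sum((d - sum(v) / len(v)) ** 2 for v in diffs.values() for d in v)
    df = n - len(diffs)          # n subjects minus number of sequences
    return ss / df / 2.0         # within-subject variance

# hypothetical log(PK) values: (sequence, R-period-1, R-period-2)
reference = [("TRTR", 1.10, 1.00), ("TRTR", 0.95, 1.05),
             ("RTRT", 1.20, 1.10), ("RTRT", 1.00, 1.10)]
print(s2w_from_replicates(reference))  # ≈ 0.01 for this toy data set
```

The same function gives s²WT when fed the two T-period observations instead, which is exactly the R-to-T swap made in the copied workflow.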


Angus
jag009
★★★

NJ,
2015-01-07 22:28
(3367 d 20:33 ago)

@ AngusMcLean
Posting: # 14241
Views: 39,892
 

 S×F vari­­ance

Hi Angus,

Sorry for being M.I.A. I have had a buffet full of crap to do here since I came back. Will get back to you before the end of the week? :-)

John

P.S. Helmut! Had a good chat with Laszlo over dinner when I was in Toronto. He was on fire "literally" :-) No we didn't drink Absinthe.
Helmut
★★★
avatar
Homepage
Vienna, Austria,
2015-01-08 17:10
(3367 d 01:51 ago)

@ jag009
Posting: # 14256
Views: 39,788
 

 S×F vari­­ance

Hi John

❝ Had a good chat with Laszlo over dinner when I was in Toronto. He was on fire "literally" :-)


I can imagine. He sent me a copy of the comments he filed together with László Tóthfalusi at the FDA.
He further wrote: “I expect that, following its habits, FDA will not respond and will not move.”

❝ No we didn't drink Absinthe.

What a shame. [image]

Dif-tor heh smusma 🖖🏼 Довге життя Україна! [image]
Helmut Schütz
[image]

The quality of responses received is directly proportional to the quality of the question asked. 🚮
Science Quotes
jag009
★★★

NJ,
2015-01-08 18:32
(3367 d 00:29 ago)

(edited by jag009 on 2015-01-08 19:49)
@ Helmut
Posting: # 14257
Views: 39,736
 

 S×F vari­­ance: Typo.

Hi Helmut,

He basically said computation #2 is pointless at this stage. He said the 0.03 is NOT a criterion, it's an "allowance". In that case Computation #2 is just for data collection... Also, I brought up the S2D computation suggested by Ana (see my post to Angus in this thread) using the ABE output (G-matrix). He said it is nonsense (I assume I interpreted the equations correctly to him; I did tell him her computation involves using between-subject variances).

Nope, no Absinthe, just a glass of wine :-)
On another note, ever tried a liquor called "Ever Clear"? 190 Proof... :surprised:

John
AngusMcLean
★★  

USA,
2015-01-08 19:51
(3366 d 23:10 ago)

@ jag009
Posting: # 14259
Views: 39,759
 

 S×F vari­­ance

Hello John: I am confused by Ana's material. You may find it interesting to look at pages 15 and 16 of the lecture below from a biostatistician.

[image] webarchive

Angus
jag009
★★★

NJ,
2015-01-08 21:02
(3366 d 21:59 ago)

@ AngusMcLean
Posting: # 14260
Views: 39,593
 

 S×F vari­­ance

Thanks Angus,

Slides 15 and 16 are strange... 15 shows SD while 16 shows S2D. Typos?

John
AngusMcLean
★★  

USA,
2015-01-08 21:27
(3366 d 21:33 ago)

@ jag009
Posting: # 14261
Views: 39,620
 

 S×F vari­­ance

❝ Slides 15 and 16 are strange... 15 shows SD while 16 shows S2D. Typos?


I think the first one is the standard deviation and the second one is the variance.
jag009
★★★

NJ,
2015-01-08 21:33
(3366 d 21:28 ago)

@ AngusMcLean
Posting: # 14262
Views: 39,691
 

 S×F vari­­ance

Oops, read too fast.

I am not getting it. Okay, I am no Biostat expert...

John
The Bioequivalence and Bioavailability Forum is hosted by
BEBAC Ing. Helmut Schütz