Helmut ★★★ Vienna, Austria, 2014-12-23 16:32 (3410 d 20:31 ago) Posting: # 14150 Views: 43,042 |
|
Dear all – or shall I rather say Angus & John? Since the other thread (due to the indenting of posts) was becoming increasingly difficult to read, I decided to close it. Please continue here. Regrettably I had too little spare time to dive into it. However, some points:
— Dif-tor heh smusma 🖖🏼 Довге життя Україна! Helmut Schütz The quality of responses received is directly proportional to the quality of the question asked. 🚮 Science Quotes |
AngusMcLean ★★ USA, 2014-12-23 17:32 (3410 d 19:31 ago) @ Helmut Posting: # 14151 Views: 40,887 |
|
Thanks Helmut: I understand the point immediately below very well. ❝ Regrettably I had too little spare time to dive into. However, some points: ❝ 1. #14138: John is correct that the setup should essentially follow FDA’s code for RSABE. Therefore, anything obtained from ABE is not useful. But I cannot reconcile it with the point immediately below, since this pertains to the average-BE part of the template, so why do it if you are focusing on RSABE? ❝ Modifying the ABE-part doesn’t do the job (see #1 above). What you could do: Copy the entire Also I had planned on doing the three chi-squared calculations in NCSS described in Part 4 of the new Methylphenidate Guidance (2014). I am puzzled by how you can do this in Phoenix (see below). ❝ Ana’s new templates removed the clumsy workaround for joining values of the χ², since Angus |
Helmut ★★★ Vienna, Austria, 2014-12-23 18:17 (3410 d 18:46 ago) @ AngusMcLean Posting: # 14152 Views: 40,837 |
|
Hi Angus, ❝ But I cannot reconcile it with immediately below, since this pertains to the average BE part of the template so why do it if you are focusing on RSABE? I meant the sub-WF of the RSABE-WF. I stated “Prepare Data for RSABE analysis”, right? ❝ Also I had planned on doing the three chi-squared calculations in NCSS described in Part 4 of the new Methylphendiate Guidance (2014). I am puzzled by how you can do this in Phoenix (see below) ❝ ❝ ❝ Ana’s new templates removed the clumsy workaround for joining values of the χ², since PHX’s function chiinv(p, df) expects a p-value (here 0.95) and the degrees of freedom (df) from the model. See my previous post (answer #2) for where to look in the template and how it can be coded. |
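For readers without PHX at hand: chiinv(p, df) is just the chi-squared quantile function. As an illustrative cross-check (my addition; it assumes SciPy's chi2.ppf agrees with PHX's chiinv, both returning the p-quantile), the values quoted later in this thread can be reproduced like this:

```python
# Illustrative cross-check of PHX's chiinv(p, df) using SciPy.
# chiinv(p, df) returns the p-quantile of a chi-squared distribution
# with df degrees of freedom (here p = 0.95 and the df from the model).
from scipy.stats import chi2


def chiinv(p: float, df: int) -> float:
    """Chi-squared quantile, mirroring PHX's chiinv(p, df)."""
    return float(chi2.ppf(p, df))


# df = 71 (reference) and df = 67 at p = 0.05, as reported in this thread:
print(round(chiinv(0.95, 71), 8))  # ≈ 91.67023918
print(round(chiinv(0.05, 67), 8))  # ≈ 49.16227018
```

The function name is the template's; the SciPy translation is only a sketch for checking numbers outside PHX.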
AngusMcLean ★★ USA, 2014-12-29 00:12 (3405 d 12:51 ago) @ Helmut Posting: # 14190 Views: 40,405 |
|
Thanks Helmut: Using the FDA RSABE template with the EMEA simulated data on progesterone, I checked the chi-squared calculation independently in NCSS. It verified exactly the value obtained in Phoenix 6.4. Following your suggestion I am focusing on the RSABE workflow rather than average BE. I copied the workflow object "Prepare......" within the RSABE worksheet and used this copy to attempt to include Wt instead of Wr; that was my goal. It did run and I got a value, so here are my values from Phoenix 6.4, with John's values from SAS in parentheses: S2Wr = 0.19931 (0.19931); S2Wt = 0.11864 (0.11654); S2Wi = 0.16590 (0.16590). We note the values are the same except for the one where I introduced the code for Wt. I am thinking there is a discrepancy in my coding, so I will look at it again. |
AngusMcLean ★★ USA, 2014-12-23 21:56 (3410 d 15:07 ago) @ Helmut Posting: # 14153 Views: 40,805 |
|
Thanks Helmut: I will look at the coding we currently have in the RSABE sub-workflow "preparing data sets for analysis", with a view to understanding the steps in the filters and transformations, prior to altering R to T in the copy of the sub-workflow. Then I might be successful. |
jag009 ★★★ NJ, 2014-12-24 00:07 (3410 d 12:56 ago) @ Helmut Posting: # 14154 Views: 40,638 |
|
Helmut, back from swashbuckling? ❝ John: Note that these templates will not work in PHX/WNL6.3. I think it works, but I get a warning of some sort in the beginning. I didn't have time to play around with it, since PHX is a secondary tool for me in terms of stats (I use PHX to get my PK parameters). I can't imagine: 3 partial AUCs + AUCinf + Cmax = 4 parameters, then add in 4 × sub*form variance tests. Freaking 8 evaluations to conclude BE. OVERKILL!!!! John |
AngusMcLean ★★ USA, 2014-12-24 02:39 (3410 d 10:24 ago) @ jag009 Posting: # 14155 Views: 40,741 |
|
Thanks John: Allow me to point out to you that, per Barbara Davit's article, if the regular BE test applies for any of the metrics (swR < 0.294), then you do not use the reference-scaled approach for it. You simply use the average BE test and drop the interest in RSABE. So does that mean that the upper confidence interval of σ²D does not apply in cases where average BE is used for a metric? Yes? The metric most at risk with Concerta is the last one, I am thinking. Angus |
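As a sketch of the decision rule Angus describes (the function name is mine; the 0.294 cutoff on swR is the regulatory constant from the guidance, applied per metric):

```python
import math


def be_approach(s2wr: float) -> str:
    """Pick the BE approach for one metric from the within-subject
    variance of the reference (illustrative sketch; swR >= 0.294,
    i.e. CVwR ~ 30%, is the FDA scaling cutoff)."""
    swr = math.sqrt(s2wr)
    return "RSABE" if swr >= 0.294 else "ABE"


# With the progesterone value from this thread, s2wR = 0.19931:
print(be_approach(0.19931))  # swR ≈ 0.446 → "RSABE"
print(be_approach(0.04))     # swR = 0.2   → "ABE"
```

This is only the branching logic, not the analyses themselves.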
jag009 ★★★ NJ, 2014-12-24 17:21 (3409 d 19:42 ago) @ AngusMcLean Posting: # 14165 Views: 40,582 |
|
Hi Angus, Of course the Concerta BE studies will not be RSABE-based, since the SwRs are all less than 0.294. So for the partial AUCs + Cmax we will go through the ABE routine and conclude based on the 80–125% CI. Then Helmut's suggestion above can be used to evaluate the 2nd criterion. The 2nd criterion (it seems to me) is IBE/PBE-based, and I have no clue whether you can get the info from the ABE routine. Personally I just don't see how one can fail the 2nd criterion (in my thinking), but then you have to demonstrate both... The 95% UCB, I believe, is for FDA to collect data; they probably want it for their own use. Hope I answered your question. P.S. I still don't understand the email you got from Ana regarding the computation of S2D from the G matrix. If I recall correctly, she said to use the between-subject variances of T & R + the between-subject covariance of T & R???? See post Happy Holidays! John |
AngusMcLean ★★ USA, 2014-12-24 18:50 (3409 d 18:13 ago) @ jag009 Posting: # 14167 Views: 40,640 |
|
Hello John: Best wishes for the festive season. ❝ Of course the Concerta BE studies will not be RSABE based since the SwRs are all less than 0.294. What makes you say that? Don't you think, like Ambien, the first one is > 0.294? ❝ So for partial AUCs + Cmax we will go through the ABE routine and conclude based on 80-125% CI. Then Helmut's suggestion above can be used to evaluate the 2nd criteria. Could you define the 2nd criterion? Do you mean σ²D and the upper confidence bound? Yes? Steps 3 and 4 in the Guidance. I am asking whether we still have to evaluate the new criteria if we have shown that RSABE does not apply. That is one question I have. ❝ The 2nd criteria (seems to me) is IBE/PBE based and I have no clue if you can get the info from the ABE routine. Personally I just don't see how one can fail the 2nd criteria (in my thinking) … but then you have to demonstrate both... The 95% UCB I believe is for FDA to collect data, they probably want it for their own use. I agree. I was over there recently (30 minutes from here) and we got questions like "what are your thoughts....."; they want to use your thoughts and your data to assist them with the review process. ❝ Hope I answered your question. Please see above and clarify. ❝ P.S. I still don't understand the email you got from Ana regarding the computation of S2D from the G Matrix. If I recall correctly, she said to use the between subject variances of T & R + the between subject covariance for T & R???? See post Your recollection of the email is correct; she is telling you to use the G-matrix components, and I spelled out the calculation. I think you do understand it, but you are thinking that it is not correct for application to the RSABE worksheet. Ana is not a statistical programmer, and I am not sure how to follow up on that. There is a problem here. The only thing I can think of is that, in the case of a replicate study design, when Swr < 0.294 reference scaling is not applied.
You use average BE; then at that point you use Ana's formula from her email to calculate σ²D, and you do not use the RSABE workflow to calculate σ²D. What do you think of that? Please don’t open new posts all the time. You can edit (i.e., also add sumfink) within 24 hours. THX. [Helmut] |
AngusMcLean ★★ USA, 2015-01-06 18:30 (3396 d 18:33 ago) @ jag009 Posting: # 14226 Views: 40,078 |
|
John: These are the values from Phoenix.

Par  dfd  Var (σ²)  Cinv
WT   67   0.1186    87.1
WR   71   0.1993    91.67
WI   67   0.1659    49.16

The one that differs from your SAS value is WT. I modified the code in the RSABE workflow to get WT. You provided your values to me in an earlier note; also, in an earlier message you provided me with your σ²D value. I verified the calculation using your values of the components (see immediately below): σ²D = σ²I − 0.5·(σ²wt + σ²wr) = 0.00798. However, I was not able to verify your upper confidence boundary value using your values (see below for my check): Hσ²D = ΣEq + (ΣU)^½ = 0.06695. Can you check your data? I have an elegant spreadsheet here that I can send to you so you can see where my values come from. Angus |
jag009 ★★★ NJ, 2015-01-08 23:42 (3394 d 13:21 ago) @ AngusMcLean Posting: # 14263 Views: 39,877 |
|
Hi Angus, Finally had some time to do this quickly. I apologize, as I made a typo in my SAS code for the 95% UCB computation. My results are similar to yours.

Par  dfd  Var(σ²)  Cinv

So, the only differences between our results are with WT: my dfd is 69 while yours is 67, plus the slight difference in Var(σ²). If you count the number of subjects who have both T1 & T2 data, there are 69. I think the number of subjects used in the computation for test is the culprit (in Phoenix). For R, there are 71 subjects who completed both R1 and R2. For me, σ²D = 0.0079711687. Using the above, Hσ²D = ΣEq + (ΣU)^½ = 0.073582621. I haven't done this exercise in Phoenix, though, so I don't know how it didn't end up with n = 69 for Test at your end. Thanks John |
AngusMcLean ★★ USA, 2015-01-09 02:44 (3394 d 10:19 ago) @ jag009 Posting: # 14264 Views: 39,744 |
|
John: Thanks for the update. I did not count missing subjects. The number of subjects who completed both R1 and R2 = 73 for the full replicate data set; I checked it three times. I did not use the subject numbers in the rows; I used the row numbers (this takes into account missing subject numbers). The subjects are numbered 1 to 78, but there are only 73 rows of subjects in the study. The number of subjects who completed T1 and T2 = 69 (four subjects missed a treatment with T2). So the number of degrees of freedom for R and T is 71 and 67, respectively. That is what I have. Please can you check? Angus |
jag009 ★★★ NJ, 2015-01-12 17:53 (3390 d 19:10 ago) @ AngusMcLean Posting: # 14271 Views: 39,713 |
|
Hi Angus, ❝ That is what I have. Please can you check? I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference), which show the # of subjects used for each computation. You can use this to check and see whether WinNonlin used the same subjects per computation. Dlatt (for computation of Wt) John |
AngusMcLean ★★ USA, 2015-01-12 21:56 (3390 d 15:07 ago) @ jag009 Posting: # 14272 Views: 39,558 |
|
❝ ❝ That is what I have. Please can you check? ❝ ❝ I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference) which show the # of subjects used for each computation. You can use this to check and see if Winnonlin used the same subjects per computation? No; it did not. Phoenix has 69 pairs used for the data analysis and 9 subjects missing (11, 20, 61, 67, 69, 71, 42, 31 and 24). My above message is in error. In SAS you have 24 and 31 retained compared with Phoenix, so you have 71 subjects in the calculation. |
Helmut ★★★ Vienna, Austria, 2015-01-13 02:25 (3390 d 10:38 ago) @ AngusMcLean Posting: # 14273 Views: 39,606 |
|
Hi Angus & John, seems that you guys have some fun! @Angus: I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) for T–R, as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df: I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong?

           dfd  Var          Cinv
dlat(T)    69   0.116539674  89.39120787
dlat(R)    71   0.199313551  91.67023918
ilat(T–R)  67   0.165897781  87.10807220

Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in the intermediate dlat is indeed Sequence (and not empty!). Sometimes PHX “forgets” the model specification during copy/pasting. |
AngusMcLean ★★ USA, 2015-01-13 21:13 (3389 d 15:50 ago) @ Helmut Posting: # 14286 Views: 39,568 |
|
Many Thanks Helmut: I will certainly check out the data and your suggestion. Right now I am working "under the gun" and I cannot get to it today, but I will. The chances are it is the new step adding the Test variance that is errant. Angus |
AngusMcLean ★★ USA, 2015-01-14 22:29 (3388 d 14:34 ago) @ Helmut Posting: # 14290 Views: 39,418 |
|
❝ I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df. I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong? Helmut: 49.16 is what I have; as you say, it is the p 0.05 value one uses for the chi-squared calculation for MI. ❝ Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in On the other hand, the sequence is still specified in my copy of "Prepare data sets for analysis...". So it seems that the code needs to be altered from what I have. Angus |
jag009 ★★★ NJ, 2015-01-15 00:00 (3388 d 13:03 ago) @ AngusMcLean Posting: # 14291 Views: 39,423 |
|
You guys have fun. That's why I like SAS more. I have more control (but most of the time less control!!) John |
Helmut ★★★ Vienna, Austria, 2015-01-15 01:54 (3388 d 11:09 ago) @ jag009 Posting: # 14292 Views: 39,468 |
|
Hi John, ❝ You guys have fun. That's why I like SAS more. I have more control (but most of the time less control!!) I think all these systems (SAS, PHX, R) can be nasty beasts. They love to be treated carefully and hate quick-shots. What I missed: for H2/H3 one uses 1–α (like in RSABE), but for H1 α… Dammit! What I have so far: σ²D 0.007971169. My 0.081075759 ≠ your 0.073582621. What a mess! |
jag009 ★★★ NJ, 2015-01-15 23:38 (3387 d 13:25 ago) @ Helmut Posting: # 14295 Views: 39,462 |
|
Hi Helmut, For each of the chi-squared values, I used in SAS: for i, cinv(0.05, df_it); for t, cinv(1-0.05, dfdt); for r, cinv(1-0.05, dfdr). Correct? Since: χ²α, n−s for i; χ²1−α, n−s for t; χ²1−α, n−s for r. John |
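Translated into Python for illustration (an assumption on my part that SciPy's chi2.ppf matches SAS's cinv(p, df), both returning the p-quantile), John's three calls give:

```python
from scipy.stats import chi2

# John's three SAS calls, translated (cinv(p, df) -> chi2.ppf(p, df));
# the degrees of freedom are the ones discussed in this thread:
df_it, dfdt, dfdr = 67, 69, 71

chi_wi = float(chi2.ppf(0.05, df_it))      # for i: lower 5% quantile
chi_wt = float(chi2.ppf(1 - 0.05, dfdt))   # for t: upper 95% quantile
chi_wr = float(chi2.ppf(1 - 0.05, dfdr))   # for r: upper 95% quantile

print(chi_wi, chi_wt, chi_wr)
# ≈ 49.16227018, 89.39120787, 91.67023918 (the values quoted in the thread)
```

Only a cross-check sketch; the variable names mirror John's SAS code.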
Helmut ★★★ Vienna, Austria, 2015-01-16 01:13 (3387 d 11:50 ago) @ jag009 Posting: # 14296 Views: 39,300 |
|
Hi John, ❝ For each of the chi square values, I used in SAS: […] According to what’s stated in the guidance, correct. Which does not imply that I understand why to use 0.95 for T & R and 0.05 for T−R. Did you check the other results? |
jag009 ★★★ NJ, 2015-01-19 21:09 (3383 d 15:54 ago) @ Helmut Posting: # 14304 Views: 39,050 |
|
❝ According to what’s stated in the guidance, correct. Which does not imply that I understand why to use 0.95 for T & R and 0.05 for T-R. Probably because the value is derived from T−R (and is therefore subject to a different alpha?) rather than from one entity (T or R)? I have no clue. Maybe I will dig around after finishing my convolution/deconvolution work at the office... John |
jag009 ★★★ NJ, 2015-01-19 17:02 (3383 d 20:01 ago) @ Helmut Posting: # 14303 Views: 39,054 |
|
Hi Helmut, Yours vs. mine from SAS (Eq, Hq, Uq): looks like we differ in the H and U values, and the culprit lies within H. Here are my equations for the H's: H_1 = (df_it*s2wi)/chi_wi; Please see my previous post on the χ² statements for Wi, Wt and Wr. The n−s is different per equation (right?). df_it = DF for i, dfdt = DF for t, dfdr = DF for r. Maybe I am wrong (or you?)? John |
AngusMcLean ★★ USA, 2015-01-30 00:38 (3373 d 12:25 ago) @ Helmut Posting: # 14330 Views: 38,595 |
|
❝ @Angus: I get the same complete 71 subjects (69 df) for T, 73 (71 df) for R, and 69 (67 df) as John reported in this post. From that I get exactly the variances he reported in SAS. The only difference is the Cinv for 67 df. I would get 49.16227 for p 0.05 instead of 0.95. Maybe I got sumfink wrong? ❝ ❝ dlat(T) 69 0.116539674 89.39120787 ❝ dlat(R) 71 0.199313551 91.67023918 ❝ ilat(T–R) 67 0.165897781 87.10807220 ❝ ❝ Can you check your code a fourth time, please? When you copied the R-workflow and modified it for T, please check whether the fixed effect in I am able to resume this work. John sent me the list of subjects he used in SAS (71 in total). I looked at the list of subjects I have, and it is 69 subjects. The extra subjects he has over me are numbers 24 and 31. I looked at the object within the "Copy of Prepare data sets for analysis….." entitled Dij complete rows Data Wizard. This is the point where incomplete data are excluded by the transformation. I see the commands for the transformation and reproduce them below: Exclude where [LOG DATA R1] is NULL entire ROW. For the transformation I changed over to LOG DATA T1 for the x column and LOG DATA T2 for the y column. John sent me his data set for the Wt calculation and I compared it with the subjects in mine. This transformation excludes subjects 24 and 31, which are retained in John's data set, so I get 69 instead of 71 subjects. Is it the correct exclusion criterion to apply? The fixed effect is set to Sequence in the intermediate file mentioned above. Angus |
AngusMcLean ★★ USA, 2015-01-31 00:24 (3372 d 12:39 ago) @ jag009 Posting: # 14333 Views: 38,637 |
|
❝ I pulled these counts off the SAS computational datasets (ilat, dlat for test, dlat for reference) which show the # of subjects used for each computation. You can use this to check and see if Winnonlin used the same subjects per computation? John: You have 71 subjects (69 df) for Wt; I have 69 subjects (67 df). The two subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises: what are the rules in your code that allow you to include them in SAS? Excluding them in Phoenix follows the code in Phoenix. Angus |
jag009 ★★★ NJ, 2015-02-02 17:31 (3369 d 19:32 ago) @ AngusMcLean Posting: # 14354 Views: 38,283 |
|
Angus, ❝ John: You have 71 subjects (69 df) for Wt: I have 69 subjects (67 df) ......the 2 subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises what are the rules in your code that allow you to include them in SAS. Before we proceed: are you in agreement with my subject listing for Dlatr and ilat? I looked at the Dlatt (Wt) listing and the SAS dataset and noted the following: subject 24 only has TTR (missing one R), and subject 31 has RTT (missing one R). Why would WinNonlin drop these two subjects? Both have two Ts, and hence Dlatt should have been calculated just like for the rest of the qualified subjects. John P.S. I am still waiting for Helmut's response on the H and U calculations... |
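A toy sketch (hypothetical numbers, not the actual progesterone data) of the two selection rules under discussion: the SAS dlat-for-T rule keeps every subject with both T observations, whereas a filter that also requires R1 (as in the copied Phoenix transformation) drops subjects who miss that reference period:

```python
# Hypothetical per-subject period data; None marks a missing period.
subjects = {
    1:  {"T1": 4.1, "T2": 4.0, "R1": 3.9,  "R2": 4.2},   # complete
    24: {"T1": 4.3, "T2": 4.1, "R1": None, "R2": 4.0},   # missing one R
    31: {"T1": 3.8, "T2": 3.7, "R1": None, "R2": 3.9},   # missing one R
    42: {"T1": 4.0, "T2": None, "R1": 4.1, "R2": 3.9},   # missing one T
}

# SAS-like rule for dlat(T): keep subjects with BOTH test periods.
kept_sas = [s for s, d in subjects.items()
            if d["T1"] is not None and d["T2"] is not None]

# The stricter filter ("Exclude where R1 is NULL"): also requires R1.
kept_phx = [s for s, d in subjects.items()
            if all(d[k] is not None for k in ("T1", "T2", "R1"))]

print(kept_sas)  # [1, 24, 31] -> 24 and 31 retained for the Wt computation
print(kept_phx)  # [1]        -> 24 and 31 dropped because R1 is missing
```

The data are invented purely to show why the two subject counts diverge.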
AngusMcLean ★★ USA, 2015-02-02 18:22 (3369 d 18:41 ago) (edited by AngusMcLean on 2015-02-02 22:16) @ jag009 Posting: # 14355 Views: 38,317 |
|
❝ ❝ John: You have 71 subjects (69 df) for Wt: I have 69 subjects (67 df) ......the 2 subjects you include in SAS and I exclude in Phoenix are 24 and 31. So the question arises what are the rules in your code that allow you to include them in SAS. ❝ ❝ Before we proceed. Are you in agreement with my subject listing for Dlatr and ilat? I looked at Dlatt (wt) listing and the SAS dataset and noted the following: ❝ ❝ Subject 24 only has TTR (missing one R), and subject 31 has RTT (missing one R). Why would Winnonlin drop these two subjects? Both have 2 Ts and hence Dlatt should've been calculated just like the rest of the qualified subjects. ❝ ❝ John ❝ ❝ P.S. I am still waiting for Helmut's response the H and u calculations... John: Thank you. The only difference was in the WT result. What helped me out was that you gave me the listing you used for WT. Because you used subjects 24 and 31, that gave me the clue as to how to modify the Phoenix workflow to get the same (and correct) result, agreeing with your (and Helmut's) results. The subjects with replicate values available for T (e.g. 24 and 31) are now included, since I modified the transformation criteria to accept subjects with replicate T values. Also, the calculation is now for test ratios, not reference. I did not bother using a copy of "Prepare Data Sets for Analysis ..."; I modified the standard one that comes with the worksheet, and I got results almost identical to Helmut's (WT 0.1165394 and χ² 89.391268). I have a spreadsheet in Excel that does the next steps (my results below). I did round my values to 4 decimals (I should have used them all): Hσ²D = ΣEq + (ΣU)^½ = 0.06696. I can send you the spreadsheet if you like: it is set up like the Guidance, so you can check and confirm the steps sequentially. Alternatively, send me the values you have for the variance parameters (with all your decimals) and I will calculate the final steps in the spreadsheet sequentially.
If I assume your prior values are still current:

Par  dfd  Var(σ²)      Cinv
WT   69   0.116539674  89.39120787
WR   71   0.199313551  91.67023918
WI   67   0.165897781  49.16227018

Then I get σ²D = σ²I − 0.5·(σ²wt + σ²wr) = 0.00798 and Hσ²D = ΣEq + (ΣU)^½ = 0.06692. Note: my value for H1 (MI) is 0.11305 and uses the χ² value of 49.16227018; it is smaller than your value since the χ² (p = 0.05) is 49.16227018. Angus |
jag009 ★★★ NJ, 2015-02-05 19:14 (3366 d 17:49 ago) (edited by jag009 on 2015-02-05 21:10) @ AngusMcLean Posting: # 14381 Views: 38,026 |
|
Thanks Angus, You finally crossed the finishing line! Here are my #s again:
Mine from SAS: my chi-squared values were Wi 49.16227018, Wt 89.39120787, Wr 91.67023918. Can you present all your values (the above ones)? Your value for Hσ²D = ΣEq + (ΣU)^½ = 0.06692 is still different from mine → 0.0735826211. My chi-squared equations: for i, cinv(0.05, df_it), i.e. χ²α, n−s; for t, cinv(1-0.05, dfdt), i.e. χ²1−α, n−s; for r, cinv(1-0.05, dfdr), i.e. χ²1−α, n−s. What values do you use for n−s? Our S2D values are more or less the same; mine was 0.0079711687. Thanks John. P.S. Can you send me your spreadsheet? Interested in seeing how yours looks. You know how to email me, right? |
AngusMcLean ★★ USA, 2015-02-06 02:39 (3366 d 10:24 ago) @ jag009 Posting: # 14382 Views: 38,108 |
|
❝ P.S. Can you send me your spreadsheet? Interested in seeing how yours look like. You know how to email me right? I will send you the spreadsheet tomorrow: the structure matches the Guidance document. Please check the H1 value in your work and my spreadsheet: that is the difference. Edit: Full quote removed. Please delete everything from the text of the original poster which is not necessary in understanding your answer; see also this post! [Helmut] |
jag009 ★★★ NJ, 2015-02-06 18:13 (3365 d 18:50 ago) @ AngusMcLean Posting: # 14383 Views: 37,859 |
|
Hi Angus, Something is wrong with your computation of H1. We agreed on MI = 0.165897781, dfd = 67, χ²p=0.05,67 = 49.16227018. From your previous post: ❝ Note: my value for H1 (MI) is 0.11305 and uses CHi value of 49.16227018; it is smaller than your value since CHI (p=0.05) IS 49.16227018 From FDA's equation: H1 = (n−s)·MI / χ²α,n−s. How did you end up with H1 = 0.11305? I reversed the equation, using your H1 value to solve for (n−s), and ended up with (n−s) = 33.5. Looks like your H1 equation is wrong (you used the H2 and H3 equations). FDA's H2 and H3 equations are the same except that H2 uses MT and H3 uses MR: H = −0.5·(n−s)·M# / χ²1−α,n−s, where # = T or R. If you use the correct H1 equation then H1 = 0.226091092, and the final Hσ²D equals my value of 0.073583. |
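For the record, a numerical sketch of this calculation (Python/SciPy as a stand-in for SAS cinv; MI, MT, MR and their df are the full-precision values posted earlier in this thread), showing that the H1 equation without the 0.5 factor reproduces John's upper bound:

```python
from math import sqrt
from scipy.stats import chi2

# Full-precision values from the thread (within-subject estimates, df):
mi, df_i = 0.165897781, 67
mt, df_t = 0.116539674, 69
mr, df_r = 0.199313551, 71

# Point estimate: s2D = MI - 0.5*(MT + MR)
e1, e2, e3 = mi, -0.5 * mt, -0.5 * mr
s2d = e1 + e2 + e3                             # ≈ 0.0079712

# Confidence limits per the guidance (alpha for H1, 1-alpha for H2/H3):
h1 = df_i * mi / chi2.ppf(0.05, df_i)          # NB: no 0.5 factor here
h2 = -0.5 * df_t * mt / chi2.ppf(0.95, df_t)
h3 = -0.5 * df_r * mr / chi2.ppf(0.95, df_r)

u = (h1 - e1) ** 2 + (h2 - e2) ** 2 + (h3 - e3) ** 2
h_s2d = s2d + sqrt(u)                          # 95% upper confidence bound

print(round(h1, 9), round(h_s2d, 6))           # ≈ 0.226091092, 0.073583
```

Only an illustrative re-derivation of the numbers exchanged in this thread, not the validated SAS or PHX code.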
AngusMcLean ★★ USA, 2015-02-06 20:36 (3365 d 16:27 ago) @ jag009 Posting: # 14384 Views: 37,850 |
|
Thanks John: you are correct! The answer to your question is that I used a factor of 0.5 in the H1 equation, and it does not belong there, as is evident in the source FDA Guidance. It is the other two parameters that are modified by the 0.5 multiplier. Now your data check out on my spreadsheet, since, as you say, the final value is 0.07358. The Phoenix calculation of WT was done on an ad-hoc basis by modifying the existing code (WR). What should be done is that workflow(s) should be created such that both WT and WR can be obtained from the same workflow or workflows. That way you do not need to modify the WR code each time you need WT as well as WR. After all, it is very relevant to have a comparison of WT and WR when comparing two formulations. Therefore my spreadsheet now validates your calculations, and I have corrected it. Edit: Full quote removed. Please delete everything from the text of the original poster which is not necessary in understanding your answer; see also this post! [Helmut] |
jag009 ★★★ NJ, 2015-02-17 23:49 (3354 d 13:14 ago) @ AngusMcLean Posting: # 14450 Views: 36,985 |
|
Angus, Just curious. Do you think we got the n−s term correctly identified? Is it different for MI, MT and MR, or should there be only one (n−s)? Thanks John |
AngusMcLean ★★ USA, 2015-02-20 00:25 (3352 d 12:38 ago) @ jag009 Posting: # 14465 Views: 36,847 |
|
John: I have them different, as you see below from my Excel spreadsheet; surely they are different, since the n number changes with each parameter? How about the Prof. in Toronto, what did he say about that question?

Parameter  dfd  Var (σ²)  Cinv

Angus |
jag009 ★★★ NJ, 2015-02-21 22:08 (3350 d 14:55 ago) @ AngusMcLean Posting: # 14487 Views: 36,798 |
|
Hi Angus, I won't be seeing him until March; I will definitely ask him about this. He took a look at the Proc Mixed G-matrix route to get within-subject T, R and I, but he didn't comment in detail. He did say the #s can be obtained from that route. I just don't like the wording FDA used in the draft guidance. It looks like an open-ended question -> "σ2D has an allowance of 0.03". They want us to present both σ2D and the 95% UCB, but the ending statement seems to indicate that they want to see σ2D no more than 0.03 rather than both σ2D and the 95% UCB no more than 0.03. They need some sort of English 101 training??? John |
AngusMcLean ★★ USA, 2015-02-22 18:25 (3349 d 18:38 ago) (edited by AngusMcLean on 2015-02-22 20:27) @ jag009 Posting: # 14488 Views: 36,657 |
|
John: I am not clear on the G matrix. At an earlier date I thought it was my salvation, and I do not understand why it is not to be used. I do know that when I used it in the way I thought was appropriate, it gave a ridiculous result; but then it is likely that I am not using it in an appropriate way. Both Linda and Ana are aware of where we are, but have not commented on how to use the G matrix. I have emailed them; Linda has expressed an interest in the work you did with Helmut and this thread. Maybe the Prof. in Toronto can shed some light on this confusion. One thing I did do was create a set of replicate ABAB MP data based on actual data I have, and ran it in Phoenix. I have found it is difficult to fake suitable data. It did run OK for bioequivalence, but as yet I have not pursued steps 2, 3 and 4. The ad-hoc changes I make on the existing template to give WT could be used. I need to figure out a way of modifying the template to have a standard template that produces both the WT and WR variances. It is difficult to do this if you did not write the code in the first place. Regarding the wording of the recent MP Guidance: what I see is that the Guidance has many people reviewing and editing it sentence by sentence. It is confusing in the part you focus on. I think they are not really sure what position to take, and they want the sponsors to produce data for them, so that once they have enough data they can formulate a rational policy. As you have said yourself, I do not see them failing a submission on that part of the Guidance. Angus |
jag009 ★★★ NJ, 2015-02-23 17:58 (3348 d 19:05 ago) @ AngusMcLean Posting: # 14489 Views: 36,589 |
|
Angus, That is my concern, because it seems (I have to revisit to confirm) that the Wi obtained from the G-matrix is different from FDA's. ❝ Regarding the wording of the recent MP Guidance then what I see is the Guidance has many people reviewing and editing it sentence by sentence. It is confusing in the part you focus on. I think they are not real sure what position to take and they want to get the sponsors to produce data for them, so that after they have enough data then they can formulate a rational policy. As you have said yourself I do not see them failing a submission on that part of the Guidance. So sponsors will be guinea pigs! Big guinea pigs! John |
AngusMcLean ★★ USA, 2015-01-06 19:48 (3396 d 17:15 ago) @ Helmut Posting: # 14228 Views: 40,037 |
|
Helmut: my interest in Phoenix was to get the within-subject test variance (WT instead of WR). Following your suggestion I copied and pasted "Prepare data sets for analysis" to create a second one. Then I changed the code over in the second one from R2 and R1 to T2 and T1, as shown in the photo below. Subsequently there were few changes required (just to the text) to get WT; it seemed to work well. I note that the value obtained (0.1186) was slightly higher than John's value (0.11654). Of course the WR information is retained in the original "Prepare Data sets for analysis". I cannot see anything wrong; it seemed to be straightforward, but I am interested in why John got a different value in SAS. I think he experienced a problem doing this in Phoenix. I hope he sees this. Angus |
jag009 ★★★ NJ, 2015-01-07 22:28 (3395 d 14:35 ago) @ AngusMcLean Posting: # 14241 Views: 40,031 |
|
Hi Angus, Sorry for being M.I.A. Have a buffet full of crap to do here since I came back. Will get back to you before the end of the week? John P.S. Helmut! Had a good chat with Laszlo over dinner when I was in Toronto. He was on fire, "literally". No, we didn't drink Absinthe. |
Helmut ★★★ Vienna, Austria, 2015-01-08 17:10 (3394 d 19:53 ago) @ jag009 Posting: # 14256 Views: 39,926 |
|
Hi John, ❝ Had a good chat with Laszlo over dinner when I was in Toronto. He was on fire "literally" I can imagine. He sent me a copy of the comments he filed together with László Tóthfalusi at the FDA. He further wrote: “I expect that, following its habits, FDA will not respond and will not move.” ❝ No we didn't drink Absinthe. |
jag009 ★★★ NJ, 2015-01-08 18:32 (3394 d 18:31 ago) (edited by jag009 on 2015-01-08 19:49) @ Helmut Posting: # 14257 Views: 39,875 |
|
Hi Helmut, He basically said computation #2 is pointless at this stage. He said the 0.03 is NOT a criterion, it's an "allowance". In that case, computation #2 is just for data collection... Also I brought up the S2D computation suggested by Ana (see my post to Angus in this thread) using the ABE output (G-matrix). He said it is nonsense (I assume I interpreted the equations correctly to him; I did tell him her computation involves using between-subject variances). Nope, no Absinthe, just a glass of wine. On another note, ever tried a liquor called "Ever Clear"? 190 proof... John |
AngusMcLean ★★ USA, 2015-01-08 19:51 (3394 d 17:12 ago) @ jag009 Posting: # 14259 Views: 39,896 |
|
Hello John: I am confused by Ana's material. You may find it interesting to look at pages 15 and 16 of the lecture below, from a biostatistician. webarchive Angus |
jag009 ★★★ NJ, 2015-01-08 21:02 (3394 d 16:01 ago) @ AngusMcLean Posting: # 14260 Views: 39,731 |
|
Thanks Angus, Slides 15 and 16 are strange... 15 shows SD while 16 shows S2D. Typos? John |
AngusMcLean ★★ USA, 2015-01-08 21:27 (3394 d 15:36 ago) @ jag009 Posting: # 14261 Views: 39,760 |
|
❝ Slides 15 and 16 are strange... 15 shows SD while 16 shows S2D. Typos? I think the first one is the standard deviation and the second one is the variance. |
jag009 ★★★ NJ, 2015-01-08 21:33 (3394 d 15:30 ago) @ AngusMcLean Posting: # 14262 Views: 39,829 |
|
Oops, read too fast. I am not getting it. Okay, I am no Biostat expert... John |