AngusMcLean ★★ USA, 2014-01-02 20:02 Posting: # 12128
I have BE results from a balanced 3-way crossover design. I am looking for the intrasubject CV% between formulations in the output. What I see is a tab with the residual variance (Residual), which for AUC is 0.0022; the CV itself seems not to be included. Also, I do not see any mention of an intersubject variance term. Any comments?

Angus

Edit: Category changed. [Helmut]
Helmut ★★★ Vienna, Austria, 2014-01-02 20:47 @ AngusMcLean Posting: # 12129
Hi Angus,

❝ […] What I see is a tab with the residual variance (Residual), which for AUC is 0.0022; the CV itself seems not to be included. Also, I do not see any mention of an intersubject variance term.

Correct. All versions of WNL calculate CVs only for the 2×2 cross-over design (see the User’s Guide, p. 322 = p. 352 of the PDF). For all other cross-over designs (assuming a log-transformed analysis), you have to fire up the Data Wizard and set up two custom transformations:

CVintra(%) = 100 × √(exp(Var(Residual)) − 1)
CVinter(%) = 100 × √(exp(Var(Sequence*Subject)) − 1)
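(The same back-transformation is trivial to check outside Phoenix; a minimal Python sketch, using your residual variance of 0.0022 as the example value:)

```python
import math

def cv_pct(var_ln: float) -> float:
    """CV (%) back-transformed from a variance component of ln-scale data."""
    return 100 * math.sqrt(math.exp(var_ln) - 1)

print(cv_pct(0.0022))  # Var(Residual) for AUC -> CVintra ~ 4.69 %
```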
CVs for parallel designs and higher-order cross-overs (including untransformed data) are on Pharsight’s agenda for the next release (planned for the first half of 2014). Checked in the Phoenix Pre-Release V1.4 (Build 6.4.0.511, 2013-12-20): not implemented yet…
AngusMcLean ★★ USA, 2014-01-02 23:53 @ Helmut Posting: # 12130
Thank you, Helmut. I am familiar with the Data Wizard for transformations, but precisely which information is transformed, and where are the data columns for the transformations located? Please can you advise.

Angus
Helmut ★★★ Vienna, Austria, 2014-01-03 01:21 @ AngusMcLean Posting: # 12131
Hi Angus,

❝ […] But precisely which information is transformed, and where are the data columns for the transformations located?

Oops, terribly sorry! I rarely calculate CVinter from a cross-over – I have my validated templates ready, which I avoid touching. The source of the transformations is the Final Variance Parameters tab of the Average BE results: the variances sit in the Estimate column, one row per dependent and parameter (Var(Sequence*Subject) and Var(Residual)).

[screenshot: Final Variance Parameters tab]
Join both Results, Sort by Dependent, and map as source only the respective CVs (getting rid of the variances):

[screenshot: Data Wizard – joined results]

Here you are:

[screenshot: CVintra and CVinter by dependent]

If you want, you can join it with the Average BE results (mapping only the relevant variables), getting:

[screenshot: CVs joined with the Average BE results]

It’s easy now to come up with a nice table for the report (FDA’s rounding: two decimal places for the PE/CI, four significant digits for the CVs):

[screenshot: final report table]

Hope that helps. This was a pilot study with a CVintra similarly low to yours; I greyed out the products. Sorry for the confusion caused in my OP.

BTW, concerning our last week’s conversation on Pharsight’s Extranet: below is an example of scheduled/actual times and concentrations. Generally six to eight subjects and up to three treatments fit on a single page in portrait orientation.

[screenshot: scheduled/actual times and concentrations]

Happy coding!
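(If you want to replay the join/pivot step outside Phoenix, here is a hypothetical mock-up in Python/pandas – the column names Dependent, Parameter, and Estimate mirror the Final Variance Parameters worksheet, and all numbers are invented:)

```python
import numpy as np
import pandas as pd

# Invented stand-in for the Final Variance Parameters worksheet
var_params = pd.DataFrame({
    "Dependent": ["AUC", "AUC", "Cmax", "Cmax"],
    "Parameter": ["Var(Sequence*Subject)", "Var(Residual)"] * 2,
    "Estimate":  [0.0950, 0.0220, 0.1100, 0.0310],
})

# The two custom transformations: variance (ln-scale) -> CV (%)
var_params["CV"] = 100 * np.sqrt(np.exp(var_params["Estimate"]) - 1)
label = np.where(var_params["Parameter"] == "Var(Residual)",
                 "CVintra(%)", "CVinter(%)")

# 'Sort by Dependent, map only the respective CVs': a pivot that drops
# the raw variances and keeps one row per dependent
cvs = (var_params.assign(Parameter=label)
                 .pivot(index="Dependent", columns="Parameter", values="CV")
                 .round(4))
print(cvs)
```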
AngusMcLean ★★ USA, 2014-01-03 17:25 @ Helmut Posting: # 12133
Thanks, Helmut. I am learning “filtration” in Pharsight’s program, so I will try this routine; I think I follow for the most part. Regarding actual versus nominal times, I can see why you need structure to be efficient if you work in the generic area. Recently I got some actual times from clinical data: the information was transcribed from the case-report forms and then put into Excel. I had to write Excel routines to subtract the times to get the actual elapsed time at each time point. This was a Phase 2 study, so adherence to the stipulated times was not good. Yet at the end of the day the data analysis showed no significant differences between the AUC parameters – they were almost the same, and so was the CV%.

Angus
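(The same subtraction is a one-liner outside Excel; a sketch in Python, with invented clock times:)

```python
from datetime import datetime

dosing = datetime(2013, 11, 4, 8, 2)       # actual dosing time (invented)
draws = [datetime(2013, 11, 4, 9, 1),      # nominal 1 h sample
         datetime(2013, 11, 4, 12, 10)]    # nominal 4 h sample

for t in draws:
    elapsed = (t - dosing).total_seconds() / 3600  # actual time post-dose in h
    print(f"{elapsed:.3f} h")
```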
Helmut ★★★ Vienna, Austria, 2014-01-03 18:47 @ AngusMcLean Posting: # 12135
Hi Angus,

❝ […] I can see why you need structure to be efficient if you work in the generic area.

Not only generics… I try to be efficient simply in order to have spare time for the more interesting stuff.

❝ Recently I got some actual times from clinical data: the information was transcribed from the case-report forms and then put into Excel. I had to write Excel routines to subtract the times to get the actual elapsed time at each time point. This was a Phase 2 study, so adherence to the stipulated times was not good.

Oh yes, nasty in Excel. I implemented these routines in PHX – not much nicer. Life would be much easier if software developers (and the people setting up CRFs and clinical data-management systems as well) implemented ISO 8601 completely; both Excel and Phoenix use a “crippled” version. Did you ever transfer an Excel sheet containing timestamps from a PC to a Mac? Whereas on Windows the timebase is 1900-01-01 00:00:00, the default on a Mac is 1904-01-01 00:00:00. The difference is four years, but 366+365+365+365 days – 1900 was a leap year with a Feb. 29th… Better to change the timebase beforehand. For a funny story see the end of this post.

❝ Yet at the end of the day the data analysis showed no significant differences between the AUC parameters – they were almost the same, and so was the CV%.

Yep, it rarely (if ever) matters. I have used scheduled times for almost two decades; as long as you are consistent it should be OK. But at least the EMA and WHO clearly stated that they prefer actual times – which makes sense.

P.S.: I just tested the latest Pre-Release of PHX 1.4. The CVs for higher-order Xovers are not implemented yet.
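(If you have to move serial dates between the two systems yourself, Microsoft’s documented conversion constant is 1462 days – it folds in both the different serial origins and Excel’s fictitious 1900-02-29. A sketch:)

```python
# Shift Excel serial dates between the Windows (1900) and Mac (1904)
# date systems; 1462 is Microsoft's documented offset.
OFFSET_1900_TO_1904 = 1462

def serial_1900_to_1904(serial: float) -> float:
    return serial - OFFSET_1900_TO_1904

def serial_1904_to_1900(serial: float) -> float:
    return serial + OFFSET_1900_TO_1904

print(serial_1900_to_1904(41642))  # 2014-01-03: 41642 (1900) -> 40180 (1904)
```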
SDavis ★★ UK, 2014-02-20 00:43 @ Helmut Posting: # 12459
Helmut,

❝ ... Nasty in Excel.

I don’t believe – nor does Phoenix ;0) – that 1900 was a leap year: https://en.wikipedia.org/wiki/Leap_year. I know Excel does think that, although I can’t recall whether SAS presents it correctly; I am sure it does.

Simon
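(The full Gregorian rule in one line – divisible by 4, except century years, unless divisible by 400:)

```python
def is_leap(year: int) -> bool:
    """Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(1900), is_leap(2000))  # False True
```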
Helmut ★★★ Vienna, Austria, 2014-02-20 02:03 @ SDavis Posting: # 12460
Hi Simon,

❝ ❝ ... Nasty in Excel.
❝ I don’t believe – nor does Phoenix ;0) – that 1900 was a leap year …

Oops – THX for brushing up my knowledge (https://en.wikipedia.org/wiki/Century_leap_year).

❝ although I know Excel does think that, I can’t recall whether SAS presents it correctly; I am sure it does.

According to the manual: “A SAS date value is a value that represents the number of days between January 1, 1960, and a specified date. SAS can perform calculations on dates ranging from A.D. 1582 to A.D. 19,900. Dates before January 1, 1960, are negative numbers; dates after are positive numbers.”
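(A quick check of that definition – the date of this post as the example:)

```python
from datetime import date

SAS_EPOCH = date(1960, 1, 1)  # SAS date value 0

def sas_date_value(d: date) -> int:
    """Days between 1960-01-01 and d; negative before the epoch."""
    return (d - SAS_EPOCH).days

print(sas_date_value(date(2014, 2, 20)))   # 19774
print(sas_date_value(date(1959, 12, 31)))  # -1
```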
AngusMcLean ★★ USA, 2014-01-03 19:21 @ Helmut Posting: # 12136
Helmut: thank you, it works as you say… You can see why folks do not seek to include parallel-group designs in their study plans: with ~35% intersubject but only ~15% intrasubject variability, a cross-over gets rid of the dominant variance component.

Angus