AngusMcLean
USA, 2014-01-02 20:02
Posting: # 12128

 PHOENIX 6.3 and intrasubject variance [Software]

I have BE results from a balanced 3-way crossover design and I am looking to get the intrasubject CV (%) between formulations in the output. All I see is a tab with the residual variance, which for AUC is 0.0022.

It seems that the CV is not included. I also do not see any mention of an intersubject variance term.

Any comments?

Angus


Edit: Category changed. [Helmut]
Helmut
Vienna, Austria, 2014-01-02 20:47
@ AngusMcLean
Posting: # 12129

 PHX: CVs only for 2×2

Hi Angus,

❝ […] All I see is a tab with the residual variance, which for AUC is 0.0022.

❝ It seems that the CV is not included. I also do not see any mention of an intersubject variance term.


Correct. All versions of WNL calculate CVs only for the 2×2 cross-over design (see the User’s Guide p322 = p352 of the PDF). For all other cross-over designs (assuming log-transformed analysis), you have to fire up the Data Wizard and set up two custom transformations:
  • New Column Name: CVintra
    Formula:         sqrt(exp(Residual_Variance) - 1)

  • New Column Name: CVinter
    Formula:         sqrt(exp(Var(Subj(Seq))) - 1)
Next I would add a filter to get rid of stuff you don’t need in the output.
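In other words, for log-transformed data a CV is recovered from a variance as √(exp(σ²) − 1), or ×100 for percent. A minimal Python sketch, using the 0.0022 residual variance for AUC quoted above:

```python
import math

def cv_percent(variance: float) -> float:
    """CV% from a variance of log-transformed data: 100*sqrt(exp(var) - 1)."""
    return 100 * math.sqrt(math.exp(variance) - 1)

# residual (within-subject) variance for AUC from the original post
print(round(cv_percent(0.0022), 3))  # 4.693
```

For small variances this is close to 100·σ, which is why a residual variance of 0.0022 translates to a CVintra of roughly 4.7%.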

CVs for parallel designs and higher-order cross-overs (including untransformed data) are on Pharsight’s agenda for the next release (planned for the first half of 2014). I will check this week whether something is already implemented in the current beta-version.


Checked in Phoenix Pre-Release V1.4 (Build 6.4.0.511, 2013-12-20): Not implemented yet…

Dif-tor heh smusma 🖖🏼 Довге життя Україна! [image]
Helmut Schütz
[image]

The quality of responses received is directly proportional to the quality of the question asked. 🚮
Science Quotes
AngusMcLean
USA, 2014-01-02 23:53
@ Helmut
Posting: # 12130

 PHX: CVs only for 2×2

Thank you Helmut: I am familiar with the Data Wizard for transformations. But precisely what information is transformed, and where are the data columns for the transformations located?


Please can you advise,


Angus
Helmut
Vienna, Austria, 2014-01-03 01:21
@ AngusMcLean
Posting: # 12131

 PHX: Example

Hi Angus,

❝ […] But precisely what information is transformed, and where are the data columns for the transformations located?


Oops, terribly sorry! I rarely calculate CVinter from a cross-over – I have my validated templates ready, which I avoid touching. :cool: Actually you need two Data Wizards (DWs), each consisting of both a filter and a transformation. In the BE results, navigate to the Final Variance Parameters tab.
    Data Wizard 1
  • Filter → Built In → Add → Include: Var(Residual) → Column: Parameter
    Alternatively: Custom (Include) → Add → Parameter = 'Var(Residual)'
  • Transformation → Transformation Type: Custom → New Column Name: CVintra
    Formula: 100*sqrt(exp(Estimate) - 1)
    Data Wizard 2
  • Filter → Built In → Add → Include: Var(Sequence*Subject) → Column: Parameter
    Alternatively: Custom (Include) → Add → Parameter = 'Var(Sequence*Subject)'
  • Transformation → Transformation Type: Custom → New Column Name: CVinter
    Formula: 100*sqrt(exp(Estimate) - 1)
Now Join both Results, Sort by Dependent, map as source only the respective CVs (getting rid of the variances):

[image]
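The same filter → transform → join workflow can be sketched outside Phoenix, e.g. with pandas. The data frame below is hypothetical (only the parameter names and the CV formula come from the steps above; the Cmax and intersubject variances are made up for illustration):

```python
import numpy as np
import pandas as pd

# hypothetical export of the "Final Variance Parameters" tab
fvp = pd.DataFrame({
    "Dependent": ["AUC", "AUC", "Cmax", "Cmax"],
    "Parameter": ["Var(Residual)", "Var(Sequence*Subject)"] * 2,
    "Estimate":  [0.0022, 0.0950, 0.0210, 0.1100],
})

def cv_col(df: pd.DataFrame, parameter: str, name: str) -> pd.DataFrame:
    """Mimic one Data Wizard: filter on Parameter, then transform to CV%."""
    out = df[df["Parameter"] == parameter].copy()
    out[name] = 100 * np.sqrt(np.exp(out["Estimate"]) - 1)
    return out[["Dependent", name]]

# 'Join' both results, 'Sort' by Dependent, keep only the CVs
cvs = (cv_col(fvp, "Var(Residual)", "CVintra")
       .merge(cv_col(fvp, "Var(Sequence*Subject)", "CVinter"), on="Dependent")
       .sort_values("Dependent"))
print(cvs.round(3))
```

The merge on `Dependent` plays the role of Phoenix's Join step; dropping the `Estimate` column corresponds to mapping only the CVs as source.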
Here you are:

[image]

If you want you can join it with the Average BE results (mapping only relevant variables) getting:

[image]

It’s easy now to come up with a nice table for the report (FDA’s rounding to two decimal places of the PE/CI, four significant digits of the CVs):

[image]
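The rounding conventions mentioned above (two decimal places for the PE/CI, four significant digits for the CVs) can be sketched with Python format strings; the numbers are made up for illustration:

```python
def fmt_pe_ci(x: float) -> str:
    """PE / CI limit in percent, rounded to two decimal places."""
    return f"{x:.2f}"

def fmt_cv(x: float) -> str:
    """CV% rounded to four significant digits."""
    return f"{x:.4g}"

# made-up example values
print(fmt_pe_ci(97.1234), fmt_cv(4.69299))  # 97.12 4.693
```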

Hope that helps. This was a pilot study with a CVintra similarly low to yours. I greyed out the products.
Sorry for the confusion caused in my OP.


BTW, concerning our last week’s conversation on Pharsight’s Extranet: Below an example of scheduled/actual times and concentrations. Generally six to eight subjects and up to three treatments fit on a single page in portrait orientation.

[image]

Happy coding!

AngusMcLean
USA, 2014-01-03 17:25
@ Helmut
Posting: # 12133

 PHX: Example

Thanks Helmut: I am learning "filtration" in Pharsight's program so I will try this routine. I think I follow for the most part.

Regarding actual versus nominal times: I can see why, if you work in the generics area, you need structure to be efficient. Recently I got some actual times from clinical data: the information was transcribed from the case report forms and then put into Excel. I had to write Excel routines to subtract timestamps in order to get the actual elapsed time at each time point. This was a Phase 2 study, so adherence to the stipulated times was not good. Yet at the end of the day the data analysis showed no significant differences between the AUC parameters. They were almost the same, and so was the CV (%).
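Such timestamp subtraction is straightforward outside Excel as well; a minimal Python sketch (the timestamp format and the example times are assumptions):

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M"  # assumed timestamp format

def elapsed_hours(dose_ts: str, sample_ts: str) -> float:
    """Actual elapsed time (h) between dosing and a sampling timestamp."""
    delta = datetime.strptime(sample_ts, FMT) - datetime.strptime(dose_ts, FMT)
    return delta.total_seconds() / 3600

# a nominal 2 h sample drawn 7 minutes late
print(round(elapsed_hours("2014-01-03 08:00", "2014-01-03 10:07"), 3))  # 2.117
```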


Angus
Helmut
Vienna, Austria, 2014-01-03 18:47
@ AngusMcLean
Posting: # 12135

 PHX: Example

Hi Angus,

❝ […] I can see why, if you work in the generics area, you need structure to be efficient.


Not only generics… I only try to be efficient in order to have spare time for the more interesting stuff. ;-)

❝ Recently I got some actual times from clinical data: the information was transcribed from the case report forms and then put into Excel. I had to write Excel routines to subtract timestamps in order to get the actual elapsed time at each time point. This was a Phase 2 study, so adherence to the stipulated times was not good.


Oh yes. Nasty in Excel. I implemented these routines in PHX – not much nicer. Life would be much easier if software developers (and the people setting up CRFs and clinical data management systems as well) implemented ISO 8601 completely. Both Excel and Phoenix use a “crippled” version. Did you ever transfer an Excel sheet containing timestamps from a PC to a Mac? Whereas on Windows the timebase is 1900-01-01 00:00:00, on a Mac the default is 1904-01-01 00:00:00. The difference is four years, but 366+365+365+365 days. 1900 was a leap year with a Feb. 29th… Better to change the timebase beforehand. For a funny story see the end of this post.
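A quick cross-check of the two timebases in Python (note that Python's proleptic Gregorian calendar, unlike Excel's 1900 date system, treats 1900 as a common year; the 1462-unit conversion is the figure Microsoft documents for moving workbooks between the two date systems):

```python
from datetime import date

# calendar distance between the two Excel epochs
diff = (date(1904, 1, 1) - date(1900, 1, 1)).days
print(diff)  # 1460 — none of 1900–1903 is a leap year in the Gregorian calendar

# Microsoft documents 1462 serial units as the conversion between the
# 1900 and 1904 date systems: the 1460 calendar days, plus Excel's
# phantom 1900-02-29, plus the one-unit difference in serial origins.
print(diff + 2)  # 1462
```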

❝ Yet at the end of the day the data analysis showed no significant differences between the AUC parameters. They were almost the same, and so was the CV (%).


Yep, it rarely (if ever) matters. I have used scheduled times for almost two decades. As long as you are consistent it should be OK. But at least the EMA and the WHO have clearly stated that they prefer actual times – which makes sense.

P.S.: I just tested the latest Pre-Release of PHX1.4. The CVs for higher-order Xovers are not implemented yet.

SDavis
UK, 2014-02-20 00:43
@ Helmut
Posting: # 12459

 PHX: Example

Helmut,

❝ ... Nasty in Excel.


I don't believe – nor does Phoenix ;0) – that 1900 was a leap year: https://en.wikipedia.org/wiki/Leap_year

Although I know Excel does think so, I can't recall offhand whether SAS presents it correctly – but I am sure it does.
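The century rule in question fits in a few lines (a generic illustration, not Phoenix or SAS code):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year, except centuries not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(1900), is_leap_year(2000), is_leap_year(2014))
# False True False
```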

Simon
Senior Scientific Trainer, Certara™
https://www.youtube.com/watch?v=xX-yCO5Rzag
https://www.certarauniversity.com/dashboard
https://support.certara.com/forums/
Helmut
Vienna, Austria, 2014-02-20 02:03
@ SDavis
Posting: # 12460

 Century leap years

Hi Simon,

❝ ❝ ... Nasty in Excel.


❝ I don't believe – nor does Phoenix ;0) – that 1900 was a leap year: https://en.wikipedia.org/wiki/Leap_year


Oops – THX for brushing up my knowledge (https://en.wikipedia.org/wiki/Century_leap_year).

❝ Although I know Excel does think so, I can't recall offhand whether SAS presents it correctly – but I am sure it does.


According to the manual:

SAS date value

is a value that represents the number of days between January 1, 1960, and a specified date. SAS can perform calculations on dates ranging from A.D. 1582 to A.D. 19,900. Dates before January 1, 1960, are negative numbers; dates after are positive numbers.

  • SAS date values account for all leap year days, including the leap year day in the year 2000.

  • SAS date values can reliably tell you what day of the week a particular day fell on as far back as September 1752, when the calendar was adjusted by dropping several days. SAS day-of-the-week and length-of-time calculations are accurate in the future to A.D. 19,900.
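The definition quoted above (days relative to 1960-01-01, negative before and positive after) is easy to mirror; a hedged Python sketch:

```python
from datetime import date

SAS_EPOCH = date(1960, 1, 1)

def sas_date_value(d: date) -> int:
    """Days between 1960-01-01 and d: negative before, positive after."""
    return (d - SAS_EPOCH).days

print(sas_date_value(date(1960, 1, 1)))    # 0
print(sas_date_value(date(1959, 12, 31)))  # -1
print(sas_date_value(date(1960, 1, 2)))    # 1
```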

AngusMcLean
USA, 2014-01-03 19:21
@ Helmut
Posting: # 12136

 PHX: Example

Helmut: Thank you, it works as you say… You can see why folks do not seek to include parallel-group designs in their study plans. I have ~35% for intersubject variability, but ~15% for intrasubject variability.

Angus


The Bioequivalence and Bioavailability Forum is hosted by BEBAC – Ing. Helmut Schütz