Aceto81
Belgium, 2008-06-26 18:31
Posting: # 1973

Calculation of time above MIC [🇷 for BE/BA]

Hi all,

I'm working at a small pharmaceutical company, and I'm taking my first (small) steps in the field of PK/PD.

I received an article from a colleague, which I should be able to reproduce
(article: PK and PK/PD of doxycycline in drinking water after therapeutic use in pigs, J. vet. Pharmacol. Therap. 28, 525-530, 2005.)

Cmax, Cmin, AUC, and AUCss I'm able to reproduce, but I'm struggling with the time above the MIC90 and the % of time above the MIC90.

For example: if I have a MIC90 for Pasteurella multocida of 0.517 µg/ml
and the following data:
 time  Subj   conc
  0.5   1     0.64
  1     1     0.7
  3     1     1.39
  5     1     1.35
  8     1     0.67
 12     1     1.48
 24     1     0.32
 36     1     2.08
 48     1     0.87
 60     1     0.93
 72     1     1.09
 84     1     0.59
 96     1     1.01
108     1     0.47
120     1     0.99
122     1     0.62
124     1     0.13


Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS.

I have no idea how to find the 102.5 hours.
I searched the forum's archive, the WinNonlin manual, and the web, but couldn't find any references.

Maybe someone can give me some help?
I routinely use R, but now I'm also using WinNonlin.


Thanks in advance

Ace

--
Edit: Category changed; table reformatted using BBCodes. [Helmut]
Helmut
Vienna, Austria, 2008-06-26 20:58
@ Aceto81
Posting: # 1975

Calculation of time above MIC

Dear Aceto81!

❝ ...I'm struggling with the time above the MIC90 and %time above MIC90.

❝ Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS.


Looking at your data, this seems to be a reasonable value.

❝ I have no idea how to find the 102.5 hours.

❝ I searched the forum's archive, the WinNonlin manual, and the web, but couldn't find any references.


It's not implemented in WinNonlin; but right now I'm at a conference in London where one of the lecturers is Jason Chittenden of Pharsight - I will ask him.

I don't know any reference either - except one by Jerome Skelly of the FDA, maybe 20 years ago. Start searching with the term "occupancy time" or just "occupancy".

❝ Maybe someone can give me some help?

❝ I routinely use R, but now I'm also using WinNonlin.


If the values fall below the MIC and come up again (like in your case) do a linear interpolation. An algorithm is like this:
  • Run a loop through the sampling time points
  • If two subsequent concentrations bracket the MIC and the second one is above, tag an "up" marker
  • If two subsequent concentrations bracket the MIC and the second one is below, tag a "down" marker
  • If two subsequent concentrations are >MIC add the time interval to a stack
  • If two subsequent concentrations are <MIC go to the next concentration
  • For the "up" markers perform a linear interpolation to find out where the "line" intersects the MIC; subtract this time value from the next time point and add it to the stack
  • For the "down" markers perform a linear interpolation to find out where the "line" intersects the MIC; add this time value to the last time point and add it to the stack
  • Occupancy time = the value of the stack
I would rather use R than WinNonlin for that purpose, although it shouldn't be too difficult to set it up there as well.

I just made an unchecked hotel-room quickshot in R: 116.5 hr >0.517 µg/ml within 0–124 hr?!
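Something along those lines (a from-scratch sketch of the loop above – illustrative only and not validated; the function name and the added c=0 at t=0 are my own choices):

# occupancy time: total time the profile spends above the MIC,
# interpolating linearly at the crossings
time <- c(0, 0.5, 1, 3, 5, 8, 12, 24, 36, 48, 60, 72, 84, 96, 108, 120, 122, 124)
conc <- c(0, 0.64, 0.7, 1.39, 1.35, 0.67, 1.48, 0.32, 2.08, 0.87, 0.93, 1.09,
          0.59, 1.01, 0.47, 0.99, 0.62, 0.13)
occupancy <- function(time, conc, mic) {
  occ <- 0
  for (i in 2:length(conc)) {
    t1 <- time[i-1]; t2 <- time[i]
    c1 <- conc[i-1]; c2 <- conc[i]
    if (c1 >= mic && c2 >= mic) {        # whole segment above: add it
      occ <- occ + (t2 - t1)
    } else if (c1 < mic && c2 >= mic) {  # "up" crossing: part after the intersection
      occ <- occ + (t2 - t1)*(c2 - mic)/(c2 - c1)
    } else if (c1 >= mic && c2 < mic) {  # "down" crossing: part before the intersection
      occ <- occ + (t2 - t1)*(c1 - mic)/(c1 - c2)
    }                                    # both below: nothing to add
  }
  occ
}
occupancy(time, conc, 0.517)             # 116.5 h within 0–124 h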

Aceto81
Belgium, 2008-06-26 23:50
@ Helmut
Posting: # 1976

Calculation of time above MIC

Dear Helmut,

❝ Actually just start at the beginning of the profile and add up all segments which are above the MIC. If the values fall below the MIC and come up again (like in your case) do a linear interpolation. I would rather use R than WinNonlin for that purpose.


Probably I was doing something wrong, maybe sitting too long at the PC, but when I tried this approach, I wasn't even close…

But anyway, with your reply, I started over and finally it works!

Thanks for your help (again)

Ace
PS: for anyone who may find it interesting:

The R code for finding the time below the threshold ("dat" is a data frame containing the columns "time" and "conc"; th = threshold):
f <- function(dat, th = 0.517) {
  under <- 0
  for (i in which(dat$conc < th)) {   # all samples below the threshold
    if (!is.na(dat$conc[i-1])) {      # segment entering point i
      y <- dat$conc[c(i-1, i)]
      x <- dat$time[c(i-1, i)]
      slope <- coef(lm(y ~ x))[2]     # slope of the connecting line
      # time below th within this segment (overshoots cancel out
      # when both endpoints are below the threshold)
      under <- under + x[2]-x[1]-(th-y[1])/slope
    }
    if (!is.na(dat$conc[i+1])) {      # segment leaving point i
      y <- dat$conc[c(i, i+1)]
      x <- dat$time[c(i, i+1)]
      slope <- coef(lm(y ~ x))[2]
      under <- under + (th-y[1])/slope  # time from point i to the crossing
    }
  }
  return(under)
}
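A usage note (my addition; assuming "dat" holds the columns described above): f() returns the time below the threshold within the sampled range, so the time above follows by subtraction:

below <- f(dat, th = 0.517)             # time below the MIC
above <- diff(range(dat$time)) - below  # time above, within the sampled span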



Edit: Reformatted using BBCodes. [Helmut]
Helmut
Vienna, Austria, 2008-06-27 02:18
@ Aceto81
Posting: # 1977

Calculation of time above MIC

Dear Ace,

I’m too tired to fire up R and check your code, but here’s a second quickshot in Excel – I didn’t save my R-code and wanted to get an independent solution.
We have three segments in the profile above the threshold (I inserted conc=0 at t=0).
Segment 1:   0.404 →  21.96 (delta  21.56)
Segment 2:  25.34  → 106.96 (delta  81.61)
Segment 3: 109.08  → 122.42 (delta  13.34)
                             total 116.51…

Helmut
Vienna, Austria, 2008-06-29 04:24
@ Aceto81
Posting: # 1984

Calculation of time above MIC

Dear Ace,

❝ Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS.


I overlooked this hint. The end of the last dosage interval therefore is 102.5/0.949 = 108 hours, which is strange. I would set it to the time of the last dose plus one dosage interval (τ). If τ=12 h, the end should be set to 120 h.

❝ But anyway, with your reply, I started over and finally it works!


OK, checked it again, this time with the last time point at 108 hours (later samples dropped):
time <- c(0.5,1,3,5,8,12,24,36,48,60,72,84,96,108)
conc <- c(0.64,0.7,1.39,1.35,0.67,1.48,0.32,2.08,0.87,0.93,1.09,
          0.59,1.01,0.47)
dat  <- data.frame(time, conc)

f <- function(dat, th) {
  under <- 0
    for (i in which(dat$conc < th)) {
      if (!is.na(dat$conc[i-1])) {
        y <- dat$conc[c(i-1,i)]
        x <- dat$time[c(i-1,i)]
        slope <- coef(lm(y~x))[2]
        under <- under + x[2]-x[1]-(th-y[1])/slope
      }
      if (!is.na(dat$conc[i+1])) {
        y <- dat$conc[c(i,i+1)]
        x <- dat$time[c(i,i+1)]
        slope <- coef(lm(y~x))[2]
        under <- under + (th-y[1])/slope
      }
    }
  return(under)
}

th        <- 0.517
last      <- 108
occupancy <- last - f(dat, th)
coverage  <- 100*occupancy/last
cat(" End of last dosing interval:",last,"\n",
    "Occupancy time:",occupancy,"\n",
    "Coverage:",coverage,"%\n")


Now I’m getting 103.6 hours (Coverage 95.9%) in agreement with Excel and even a manual calculation.
I have no idea how the reference's results (102.5 hours / 94.9%) were obtained (last = 120 h yields 114.5 h / 95.4%). :confused:
The R code right now needs a little cosmetics for t=0/c=0 (returns 'Error in if (!is.na(dat$conc[i - 1])) { : argument is of length zero') – although the result is still correct.
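A possible guard (just my sketch, not the posted code): test the index range before looking at the neighbour, so the zero-length vector dat$conc[0] is never touched:

# range-safe neighbour checks for f(); everything else stays unchanged
if (i > 1 && !is.na(dat$conc[i-1])) {
  # ... interpolate against the previous point, as before
}
if (i < nrow(dat) && !is.na(dat$conc[i+1])) {
  # ... interpolate against the next point, as before
}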

BTW, I looked the reference up; I’m afraid it will not be very helpful:

Skelly JP. Issues and controversies involving controlled-release drug product studies. Pharmacy International. Nov. 1986: 280–6.


Aceto81
Belgium, 2008-06-30 17:50
@ Helmut
Posting: # 1988

Calculation of time above MIC

Hi HS,

the time above MIC was calculated during steady state, which goes from 12 hours to 120 hours. That's why the results differ. (I was struggling with this part as well :ponder: )

Here is the updated code:
time <- c(0.5, 1, 3, 5, 8, 12, 24, 36, 48, 60,
          72, 84, 96, 108, 120, 122, 124)
conc <- c(0.64, 0.7, 1.39, 1.35, 0.67, 1.48, 0.32, 2.08, 0.87, 0.93,
          1.09, 0.59, 1.01, 0.47, 0.99, 0.62, 0.13)
dat  <- data.frame(time, conc)

f <- function(dat, th) {
  under <- 0
  for (i in which(dat$conc < th)) {
    if (!is.na(dat$conc[i-1])) {
      y <- dat$conc[c(i-1,i)]
      x <- dat$time[c(i-1,i)]
      slope <- coef(lm(y~x))[2]
      under <- under + x[2]-x[1]-(th-y[1])/slope
    }
    if (!is.na(dat$conc[i+1])) {
      y <- dat$conc[c(i,i+1)]
      x <- dat$time[c(i,i+1)]
      slope <- coef(lm(y~x))[2]
      under <- under + (th-y[1])/slope
    }
  }
  return(under)
}

th    <- 0.517
first <- 12
last  <- 120
occupancy <- last - first - f(dat[dat$time >= first & dat$time <= last, ], th)
coverage  <- 100*occupancy/(last - first)
cat(" Time point interval:",first,"-",last,"\n",
    "Occupancy time:",occupancy,"\n",
    "Coverage:",coverage,"%\n")
# reproduces the reference: 102.5 h, 94.9 %

Thanks for your valuable input

Ace
Helmut
Vienna, Austria, 2008-06-30 18:45
@ Aceto81
Posting: # 1989

Therapeutic Occupancy Time / MEC

Dear Ace,

OK, now everything’s clear. I like the way you set the limits – I would have used brute force and removed the data points from the data.frame… ;-)

In case you need the reference – it’s only one paragraph and a figure:

Occupancy time
  Where a therapeutic window exists, ‘Therapeutic Occupancy Time’ [1], the time that the plasma concentration stays within the therapeutic range, becomes an important criterion. In steady state, the percentage of time the drug concentration lies within the therapeutic window is important. In the cited example, the drug lies within the therapeutic window for the subject population about 80% of the time (Fig. 4).
[image]

The figure is a nice illustration of the interpolation/intersection.
It should be noted that there’s also an upper limit, which complicates things: two formulations may have identical occupancy times despite differently shaped curves! If not only the MIC but also toxicity is an issue, I would not suggest using the Occupancy Time without a thorough inspection of the individual profiles.

Another one, mixing up concentration (PK) with effect (PD) a little bit: [2]
[image]


  1. Skelly JP, Barr WH. Biopharmaceutic considerations in designing and evaluating novel drug delivery systems. Clin Res Pract Drug Reg Aff. 1985;3(4):501–39. doi:10.3109/10601338509051086.
  2. Goodman & Gilman’s The Pharmacological Basis of Therapeutics. McGraw-Hill; 11th ed. 2006. p.19.

Helmut
Vienna, Austria, 2008-08-08 22:38
@ Helmut
Posting: # 2164

Time above MIC (WinNonlin 5.2.1)

Dear Ace!

❝ ❝ I searched in […] the manual of WinNonlin, but couldn't find any references.


❝ It's not implemented in WinNonlin; but I'm right now at a conference in London where one of the lecturers is Jason Chittenden from Pharsight - I will ask him.


OK, since Pharsight invited me to become a WinNonlin 6 beta tester, I’m just digging a little bit deeper into this stuff.
I must correct myself – it’s already there, but until today I didn’t find it (in the online help: Noncompartmental Analysis > Therapeutic response).
From your workbook, open NCA model 200. After performing the usual steps (dragging time and concentration to the respective fields, entering the dose, …) go to Model > Therapeutic Response…; in the tab Therapeutic Response enter 0.517 in the field Lower. Click OK and run the calculation. You’ll find the time above the MIC (0.517) in the new workbook in the field TimeHgh.
Now for the surprise:
If the workbook contains only the data from 12–120 h, WinNonlin comes up with 109.3966 (and TimeLow 10.6034), which adds up nicely to 120 h – but nobody asked for that. Next I added a new time column subtracting 12 h from the original ones (i.e., running from 0–108 h). Results: TimeHgh 101.5885 and TimeLow 6.4115…
Again not the 102.5 h we would expect from the reference, your wonderful R code, and my quickshot in Excel. :confused:
Next I suspected some kind of interpolation issue, because I’ve set my default in WinNonlin to lin/log interpolation (Tools > Options > Models > Default Parameter Options > NCA calculation method > Linear Trapezoidal (Linear/Log Interpolation)). Changing to Linear Trapezoidal (Linear Interpolation) gave the same result.
I don’t know how the result is calculated…
Next step:
 x | y
---+---
 0 | 0
 1 | 1
 2 | 0

A simple triangle – MIC set to 0.5; expected t≥0.5 = 1 → Bingo!
So maybe it’s a problem with adding segments:
 x | y
---+---
 0 | 0
 1 | 1
 2 | 0
 3 | 1
 4 | 0

Expected 2, reported 2…
I checked some of my old data sets and always got differences from WinNonlin’s results (never more than 5%, but no agreement in a single case either).
Another example:
  t     |  C
--------+------
 0.0000 |  BQL
 0.5167 |  BQL
 1.0000 | 1.144
 1.5000 | 2.399
 2.0000 | 3.226
 2.5000 | 3.236
 3.0000 | 2.943
 3.5000 | 2.776
 4.0000 | 3.393
 4.5000 | 4.536
 5.0000 | 3.934
 6.0000 | 3.387
 9.0000 | 1.643
12.0333 | 0.717
16.0167 | 0.231
24.0500 |  BQL

Don’t be shocked by the profile – that’s a formulation with two absorption phases. I wanted to calculate the Half Value Duration (the time interval where C ≥ 50% Cmax, aka HVD). Unfortunately it’s not possible to enter a formula in the respective column – column B holds the value of Cmax, but entering =B1*0.5 in cell C1 gives “Must enter numeric value” [OK] :-D.
Therefore, the calculation has to be done somewhere else… In the example Cmax/2=2.268.
Now let’s do it the hard way: the first intersection is between 1 and 1.5 [1.144<2.268<2.399], the second one between 6 and 9 [3.387>2.268>1.643].
No fiddling around with linear regression, just a plain linear interpolation (paper, pencil, brain).
The first intersection is 1.5+(1–1.5)×(2.268–2.399)/(1.144–2.399)=1.4478, the second one 9+(6–9)×(2.268–1.643)/(3.387–1.643) = 7.9249. HVD = 7.9249–1.4478 = 6.4771.
WinNonlin comes up with 6.2153. :angry:
Now I got the idea that WinNonlin might interpolate logarithmically in the decreasing part: 9+(6–9)×(log(2.268)–log(1.643))/(log(3.387)–log(1.643)) = 9+(6–9)×(0.8190–0.4965)/(1.2199–0.4965) = 7.6631. HVD = 7.6631–1.4478 = 6.2153; Q.E.D.!
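A quick cross-check of both interpolations in R (my own scribble, not WinNonlin’s code; the function names are made up):

# intersection with the threshold th between (t1, c1) and (t2, c2)
cross.lin <- function(t1, t2, c1, c2, th)  # linear interpolation
  t2 + (t1 - t2)*(th - c2)/(c1 - c2)
cross.log <- function(t1, t2, c1, c2, th)  # log-linear interpolation
  t2 + (t1 - t2)*(log(th) - log(c2))/(log(c1) - log(c2))
up <- cross.lin(1, 1.5, 1.144, 2.399, 2.268)  # 1.4478
cross.lin(6, 9, 3.387, 1.643, 2.268) - up     # HVD 6.4771 (linear)
cross.log(6, 9, 3.387, 1.643, 2.268) - up     # HVD 6.2153 (lin up / log down)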
Actually this algorithm is reasonable (following the same logic as the lin-up/log-down option in AUC calculation) – but it’s not documented, and I couldn’t find a way to change this setting.

[image]
The green line shows the intersection in linear scale (conventional method).


In logarithmic scale it’s clear that WinNonlin’s method (blue line) intersects the curve just at the right spot. ;-)


IMHO the method is nice if stated in the protocol – but you may have to struggle when recalculating old studies or comparing your data with the literature, which may drive you nuts.
[image]

Aceto81
Belgium, 2008-08-12 11:56
@ Helmut
Posting: # 2169

Time above MIC (WinNonlin 5.2.1)

Wow Helmut,

you did a hell of a job.
Great thinking!

Do you think that the log/linear approach for the descending part is better than the classic linear approach?
As you already stated, the elimination phase is log/linear, so maybe this is more correct than the 'classic' approach.

Or maybe we can choose, as long as we clearly report which method we used?

Best regards

Ace
PS: anyone interested in the full R code with an extension to choose log or not?
Helmut
Vienna, Austria, 2008-08-12 14:32
@ Aceto81
Posting: # 2170

Time above MIC (WinNonlin 5.2.1)

Dear Ace!

❝ you did a hell of a job.

❝ Great thinking!


To be honest, it was more trial-and-error than thinking. Reading the online help (and better: the User’s Guide) was only slightly helpful. Only once I had found out what WinNonlin was doing could I understand the explanations – I’m getting old…

❝ Do you think that the log/linear approach for the descending part is better than the classic linear approach?

❝ As you already stated, the elimination phase is log/linear, so maybe this is more correct than the 'classic' approach.


Yes, I would think so – especially if time points are rather ‘far’ apart.

❝ Or maybe we can choose, as long as we clearly report which method we used?


According to WinNonlin’s User’s Guide (page 186, PDF page 204) it should be possible to choose between both options; but whatever I tried always ended up lin/log…
Another nuisance: for MR formulations it would be nice to calculate the time interval where C≥50%Cmax (Half Value Duration, HVD) or C≥75%Cmax (Plateau Time, t75%). In the ‘Therapeutic response windows’ tab WinNonlin shows the individual’s Cmax in one column, but does not allow entering a formula in the next one. :-( So it’s time for copy-and-paste…
For my two-segment absorption example it’s even worse: if I’m interested in the first and the second part separately (like with WinNonlin’s partial-area method), nothing helps but deleting the respective time points from the workbook.
If you just exclude them, WinNonlin will still present only the global Cmax. :confused:

Yesterday I downloaded the Phoenix WinNonlin 6 ‘beta’ and will have a look at how they ‘do’ it there. I didn’t dare to install the 156 MB yet (really lots of stuff: needs the M$ Visual C++ compiler and Office 2003; includes all successors of the recent version: WinNonlin, WinNonMix, the IVIVC Toolkit, a new graphical engine [yes, two independent y-axes, Trellis plots!], a graphical model builder [similar to KINETICA], and, and…).

❝ PS: anyone interested in the full R code with an extension to choose log or not?


Yes, me. :-D
I gave it a short try with a start time = 0 and got the index error again. If you go with a log version, you probably have to add small dummy concentrations for the values below the LLOQ.
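E.g. (only a sketch of that idea; the LLOQ value and the LLOQ/2 convention are arbitrary):

lloq <- 0.1                          # hypothetical LLOQ
dat$conc[is.na(dat$conc)] <- lloq/2  # dummy values keep log() finite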

Aceto81
Belgium, 2008-08-13 13:19
@ Helmut
Posting: # 2176

Time above MIC (R function)

Dear Helmut,

I changed the function radically, so here is version 2, with the data you provided (BQL was changed to NA):

dat <- structure(list(time = c(0, 0.5167, 1, 1.5, 2, 2.5, 3, 3.5, 4,
                               4.5, 5, 6, 9, 12.0333, 16.0167, 24.05),
                      conc = c(NA, NA, 1.144, 2.399, 3.226, 3.236, 2.943,
                               2.776, 3.393, 4.536, 3.934, 3.387, 1.643,
                               0.717, 0.231, NA)),
                      .Names = c("time", "conc"),
                      class = "data.frame", row.names = c(NA, -16L))
f2 <- function(dat, th, logarithmic = FALSE) {
  above <- 0
  # flag samples above the threshold; NA (BQL) counts as below –
  # note: a crossing right next to an NA sample still yields NaN
  # (cf. the LLOQ discussion below)
  w  <- ifelse(!is.na(dat$conc) & dat$conc > th, 1, 0)
  w2 <- which(abs(diff(w)) == 1)  # segments in which the profile crosses th
  if (logarithmic == FALSE) {
    for (i in w2) {
      # linear interpolation of the crossing; diff(w)[i] = +1 subtracts an
      # "up" crossing, -1 adds a "down" crossing, so 'above' sums up the
      # time spent above th
      n1 <- diff(w)[i] * -1*(dat$time[i+1] + ((dat$time[i]-dat$time[i+1]) *
            (th-dat$conc[i+1])/(dat$conc[i]-dat$conc[i+1])))
      above <- above + n1
    }
  }
  if (logarithmic == TRUE) {
    for (i in w2) {
      if (diff(w)[i] == 1) {   # "up" crossing: linear interpolation
        n1 <- (dat$time[i+1] + ((dat$time[i]-dat$time[i+1]) *
              (th-dat$conc[i+1])/(dat$conc[i]-dat$conc[i+1])))
        above <- above - n1
      }
      if (diff(w)[i] == -1) {  # "down" crossing: log-linear interpolation
        n1 <- (dat$time[i+1] + ((dat$time[i]-dat$time[i+1]) *
              (log(th)-log(dat$conc[i+1]))/(log(dat$conc[i])-log(dat$conc[i+1]))))
        above <- above + n1
      }
    }
  }
  return(above)
}

f2(dat, th = 2.268, logarithmic = TRUE)  # 6.2153 – matches the lin-up/log-down value above


Best regards

Ace
Helmut
Vienna, Austria, 2008-08-13 15:18
@ Aceto81
Posting: # 2177

Time above MIC (R function)

Dear Ace!

Thanks for your efforts! I’m a little bit short of time, so expect testing from my side later. :-(
I think a critical point is intersections between time points where one concentration is >LLOQ and the other one is <LLOQ. WinNonlin has some kind of homebrew for it…

Aceto81
Belgium, 2008-08-13 17:35
@ Helmut
Posting: # 2181

Time above MIC (R function)

Dear HS

❝ Thanks for your efforts! I’m a little bit short of time, so expect testing from my side later. :-(


No problem, I'm short of time too, but if you find any bugs, let me know and I will fix them as soon as possible.

❝ I think a critical point is intersections between time points where one concentration is >LLOQ and the other one is <LLOQ. WinNonlin has some kind of homebrew for it…


I'm not sure how I should interpret this; if you can give an example, that would be great.
But if you let me know how to treat these things, I'm sure I can put them in the function.


Ace
SDavis
UK, 2009-06-04 12:55
@ Helmut
Posting: # 3816

Phoenix-WinNonlin 6.0 released 29 May 2009

Sorry if it's off-topic on this quite old discussion, but I would like to confirm that Phoenix 1.0 (Phoenix WinNonlin 6.0 and Phoenix Connect 1.0) is released and available for download through the Pharsight Support site. This upgrade is covered by your annual maintenance fee, so there is no additional cost to you, and it can run on the same machine as WinNonlin 5.x without interference. (The validation kit will be released in a few weeks for those of you who are looking to upgrade formally.)

❝ Yesterday I downloaded Phoenix WinNonlin 6 ‘beta’ and will have a look how they ‘do’ it there. I didn’t dare to install the 156MB yet (really lots of stuff: needs M$ Visual C++ compiler and Office 2003, includes all successors of the recent version: WinNonlin, WinNonMix, IVIVC Toolkit, a new graphical engine [yes, two independent y-axes, Trellis plots!], a graphical model builder [similar to KINETICA], and, and…).


As Helmut mentions above, there are many exciting new features that I encourage you to look at; please comment on what you like and even what you don't like, so Pharsight can continue to improve this analysis tool.

Simon.

PS: and yes, the calculation of "Therapeutic Response" is still there in the NCA set-up tab, producing the following additional durations and AUCs above, within, and below those limits:

TimeLow
TimeDur
TimeHgh
TimeInfHgh
AUCLow
AUCDur
AUCHgh
AUCInfHgh

Helmut
Vienna, Austria, 2009-06-04 13:57
@ SDavis
Posting: # 3818

Phoenix-WinNonlin 6.0

Hi Simon,

thanks for the news. I would suggest that Pharsight’s customers be notified by e-mail as well.

❝ PS and yes the calculation of "Therapeutic Response" is still there...


Will be interesting to see how P1 performs. ;-)

SDavis
UK, 2009-06-04 20:50 (edited 2009-06-05 13:45)
@ Helmut
Posting: # 3824

Phoenix-WinNonlin 6.0 - introductory webinars

Thanks Helmut. All our customers should have received the notification directly by email now, in particular letting them know about the forthcoming launch webinars:

To provide you with an opportunity to learn more about these new solutions, Pharsight is hosting three upcoming webinars, summarized below. We invite you to register and visit http://pharsight.com/events/eventsonline_schedule.php.

Introducing Phoenix™ WinNonlin® 6.0

Presenter: Daniel Weiner, Ph.D., Sr. Vice President & Chief Technology Officer
June 11, 2009 - 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET)

This webinar provides an overview and demonstration of Phoenix WinNonlin 6.0, the next generation of the industry standard software tool for PK/PD modeling and noncompartmental analysis. The demonstration will focus on the major features of Phoenix WinNonlin 6.0, including: new workflow functionality for easy visual creation and reuse of PK/PD analyses, data visualization tools and high quality graphics, enhanced modeling capabilities, and a new underlying architecture to facilitate integration with third-party modeling and analysis tools.


A Detailed Comparison of Phoenix™ WinNonlin® 6.0 and WinNonlin 5.2.1

Presenter: Ana Henry, M.S., Director of Product Management
June 16, 2009 - 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET)

This webinar will provide a detailed comparison of how Phoenix WinNonlin 6.0 is similar and different from WinNonlin 5.2.1 within the context of a standard noncompartmental pharmacokinetic analysis. The comparison will focus on creating graphs, running an NCA analysis, and generating tables. The presentation will also demonstrate how to refresh out of synch workflows and how to reuse previous workflows using new datasets via the new Phoenix WinNonlin 6.0 template feature.

Introducing Phoenix Connect™ 1.0

Presenter: Jason Chittenden, M.S., Director of Product Quality
July 1, 2009 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET)

This webinar will provide an overview and demonstration of how Phoenix Connect enables interoperability between SAS, S-PLUS, NONMEM, and CDISC data sources while providing scientists the rich features of the Phoenix platform. The demonstration will feature several specific examples of how Phoenix Connect can be used to enhance the efficiency of data management, analysis, and modeling tasks.

(Remember you heard it here first, Simon ;0)

EDIT - some people said they couldn't find the list on the website so here's the link to the relevant page. There are some useful FAQ docs on the product page too.

Helmut
Vienna, Austria, 2009-06-05 01:40
@ SDavis
Posting: # 3826

Phoenix-WinNonlin 6.0 - introductory webinars

Hi Simon,

thanks for the information!

But I have to issue a warning; to quote the Forum’s Policy:

Advertisings […] or posts that are commercial in nature are prohibited and will be removed without further notice. ;-)


Astea
Russia, 2017-10-29 00:15
@ Helmut
Posting: # 17938

TimeLow

Dear smart people!

Can anyone please suggest an easy way of calculating T50%early, T50%late, T75%, T90%, and similar parameters via Phoenix?

Up to now I could only find this old thread dedicated to the Therapeutic Response module. But following this module we can only get TimeLow, TimeHigh, or TimeDuring in the output of pharmacokinetic parameters.

Imagine we have an ideal standard curve; then TimeLow = T50%early + (T − T50%late) and TimeDuring = T50%late − T50%early (where T is the total duration of observation). So it is impossible to get T50%early and T50%late separately from these data. To deal with it I had to delete the data after Tmax for calculating T50%early, or before Tmax for calculating T50%late.

By the way, in this old thread as well as in this one it was written that Phoenix approximates the decreasing part with a log-transformation independently of the rule chosen for the AUC. Now this seems to have changed, and the actual form of the approximation depends on the rule for the AUC, so it can be linear as well as log-linear. But this is not true for the partial-area calculation: for example, for AUC0–72 in subjects with a missing t=72 value, Phoenix always does a log interpolation even if we've chosen the linear AUC calculation.

Returning to the TimeLow calculation: I am also puzzled why for the following dataset we get TimeLow=18 if the Lower limit is chosen to be 3. Because if we subtract 4 from all time points we'll get 16… (linear interpolation was selected)

   t | C
 ----+----
  4  | 6
  6  | 4
  8  | 3
  10 | 2
  12 | 0
  24 | 0

"Being in minority, even a minority of one, did not make you mad"
Helmut
Vienna, Austria, 2017-10-29 01:20
@ Astea
Posting: # 17939

PHX/WNL 8.0 vs. previous releases

Hi Astea,

❝ Can anyone please suggest an easy way of calculating T50%early, T50%late, T75%, T90%, and similar parameters via Phoenix?


Maybe later. ;-)

❝ By the way, in this old thread as well as in this one it was written that Phoenix approximates the decreasing part with a log-transformation independently of the rule chosen for the AUC.


Although not described in the User’s Guide, correct.

❝ Now this seems to have changed, and the actual form of the approximation depends on the rule for the AUC, so it can be linear as well as log-linear.


Correct since the current release v8.0 (formulas (6) and (7) on p53 of the User’s Guide).

❝ But this is not true for the partial-area calculation: for example, for AUC0–72 in subjects with a missing t=72 value, Phoenix always does a log interpolation even if we've chosen the linear AUC calculation.


Like in previous versions (though I didn’t check it).

❝ I am also puzzled why for the following dataset we get TimeLow=18 if the Lower limit is chosen to be 3. Because if we subtract 4 from all time points we'll get 16… (linear interpolation was selected)

 t | C
---+---
 4 | 6
 6 | 4
 8 | 3
10 | 2
12 | 0
24 | 0


Since your data set doesn’t contain a concentration at t=0, it is extrapolated (see the core output). The method depends on the Dose Options: for Extravascular we get C=0, and for IV Bolus C=13.5 (= 6·(6/4)², i.e., log-linear back-extrapolation through the first two concentrations). Then I got these results for TimeLow in the releases of Phoenix/WinNonlin I have on my machine:

Dose Options  6.3  6.4  7.0  8.0
Extravascular  28   28   28   18
IV Bolus       31   31   31   16

I will explore it further…

Astea
Russia, 2017-10-29 01:49
@ Helmut
Posting: # 17940

PHX/WNL 8.0 vs. previous releases

Dear Helmut!

Thank you for your rapid answer!

❝ Since your data set doesn’t contain a concentration at t=0, it is extrapolated (see the core output).


Oh, now it's clear. I didn't expect this clever machine to extrapolate it to zero for Extravascular. So the 2 hours are exactly the part of the 0–4 segment where the concentration is below 3.

"Being in minority, even a minority of one, did not make you mad"
Helmut
Vienna, Austria, 2017-10-30 14:18
@ Astea
Posting: # 17951

Occupancy, Half Value Duration, Plateau Time in Phoenix/WinNonlin 8

Hi Astea,

❝ […] an easy way of calculating T50%early, T50%late, T75%, T90%, and similar parameters via Phoenix?


I don’t know any method to obtain the intersections in PHX/WNL. As discussed before, starting with release 8.0 the intersection in the decreasing part(s) depends on the selected trapezoidal method (it was always lin/log in previous releases). Simple example: \(C(t)=20\left(e^{-\ln(2)/4\cdot t}-e^{-\ln(2)\cdot t}\right)\)
 t    C
 0   0.000
 0.5 4.198
 1   6.818
 2   9.142
 3   9.392
 4   8.750
 6   6.759
 9   4.165
12   2.495
16   1.250
24   0.312

Calculation of the Occupancy Time (interval where concentrations are above a fixed value) is trivial.
More demanding are the Half Value Duration (HVD, t50%, POT-50), the Plateau Time (t75%, POT-25), and their relatives, because they depend on the subject’s Cmax. Hence, we need two steps (I used the data above for subject 1 and ½C for subject 2); example for the HVD:
  1. Send the data to NCA, map as usual (sort by subject).
     User Defined Parameters > Additional NCA Parameters > Add:
     Parameter: Cmax_2
     Definition: Cmax/2
     Include with Final Parameters
     Parameter Names > Use Internal Worksheet > Include in Workbook:
     set to No for all except Cmax.
     Rename the NCA object to Preliminary and execute.
  2. Send the data to NCA, map as usual (sort by subject).
     Therapeutic Response, link to Preliminary > Final Parameters Pivoted.
     Sort by subject; map Cmax_2 to Lower and Cmax to Upper; execute.
TimeBetween is the HVD. IMHO, TimeLow is only interesting for a fixed limit (Occupancy Time) and if t is the intended τ.
For the linear trapezoidal you should get

subject Cmax  Cmax_2 TimeLow TimeBetween
1       9.392 4.696  16.2089 7.7911
2       4.696 2.348  16.2089 7.7911

and for the lin-up/log-down

subject Cmax  Cmax_2 TimeLow TimeBetween
1       9.392 4.696  16.3383 7.6617
2       4.696 2.348  16.3383 7.6617

Linear interpolation of the intersections will always give a larger interval than the lin/log interpolation: an exponential decline is convex, so the linear chord lies above the curve and crosses the limit later in the descending part (the ascending crossing is interpolated linearly by both methods).
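The two intersections can be cross-checked outside of Phoenix; a small sketch in R (my own check, reusing the sampled profile above and Cmax/2 = 4.696; small deviations from Phoenix come from the rounded table values):

cross.lin <- function(t1, t2, c1, c2, th)  # linear interpolation
  t2 + (t1 - t2)*(th - c2)/(c1 - c2)
cross.log <- function(t1, t2, c1, c2, th)  # log-linear interpolation
  t2 + (t1 - t2)*(log(th) - log(c2))/(log(c1) - log(c2))
up <- cross.lin(0.5, 1, 4.198, 6.818, 4.696)  # 0.595 (ascending part: linear)
cross.lin(6, 9, 6.759, 4.165, 4.696) - up     # ≈ 7.791 (Phoenix: 7.7911)
cross.log(6, 9, 6.759, 4.165, 4.696) - up     # ≈ 7.661 (Phoenix: 7.6617)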

[image]
0.595 to 8.386 = 7.7911

[image]
0.595 to 8.257 = 7.6617


I posted an example project at the Certara Forum.

Astea
Russia, 2017-10-31 02:12
@ Helmut
Posting: # 17957

Occupancy, Half Value Duration, Plateau Time in Phoenix/WinNonlin 8

Dear Helmut!

Thank you for the detailed explanation!

Now everything became clear except the intersections themselves. If the calculation of TimeLow and TimeDur is based on the difference between intersections, why are the intersections not included in the output or elsewhere? Of course, the structure of the curve can be more complicated, and there could be several intersections…

So, how can one get these values (0.595 to 8.386, or 0.595 to 8.257) not by hand, brain, R, or Excel, but by Phoenix?

For a single-peak curve I came up with insane ideas of manipulating the initial data, namely truncating or reflecting the curve.

For example, with the mentioned dataset:
  1. cutting the data from 0 to Cmax, we'll get TimeLow = T50%early = 0.595056;
  2. cutting the data from Cmax to Tlast and moving the graph to zero, we'll get T50%late equal to TimeHigh + Tmax = 8.386133 or 8.256799, depending on the selected trapezoidal method;
  3. reflecting the decreasing part of the curve (that is, for t > Tmax: Ct → 2Cmax − Ct) and choosing Lower = 0.5 Cmax, Upper = 1.5 Cmax, we'll get both the linear TimeLow = T50%early = 0.595056 and TimeLow + TimeDur = T50%late = 8.386133.

[image]
[image]


Looks mad – can it be made easier?

"Being in minority, even a minority of one, did not make you mad"
Helmut
Vienna, Austria, 2017-10-31 13:32
@ Astea
Posting: # 17958

Intersections

Hi Astea,

❝ Now everything became clear except the intersections themselves. If the calculation of TimeLow and TimeDur is based on the difference between intersections, why are the intersections not included in the output or elsewhere?


Ask Certara. ;-)
If there is no demand by customers, it will not be implemented. Even if there is some, it may take ages. Example: the intercept of the log-linear regression used in the estimation of λz was only part of the Core output (if Intermediate Output was selected) up to release 7.0… After the EMA stated in 2010 that “by Cmin,ss we mean the concentration at the end of the dosage interval, i.e. Ctrough”, it took seven years to get both Cmin (as in previous versions) and Ctau (observed if t=τ, or predicted if t≠τ) in the output, plus the option to compute concentrations at arbitrary time points (useful for Cτ after a single dose). Before that we needed a workaround.

❝ Of course, the structure of the curve should be more complicated and there could be several intersections...


Exactly. See here, there, and my example above.

❝ So, how can one get these values (0.595 to 8.386, or 0.595 to 8.257) not by hand, brain, R, or Excel, but by Phoenix?


IMHO, in Phoenix/WinNonlin it’s impossible. In Phoenix/NLME it’s doable, since PML (the Phoenix Modeling Language) offers the required features to loop through the data.
May I ask: why do you need the intersections? Curiosity (aka “nice to know”)?

❝ For a single-peak curve I came up with insane ideas of manipulating the initial data, namely truncating or reflecting the curve.


Nice try! As you rightly observed, it does not work for multiple intersections and/or more than one subject.

Astea
Russia, 2017-10-31 19:24
@ Helmut
Posting: # 17959

nobody needs it...

Dear Helmut!

❝ May I ask: Why do you need the intersections? Curiosity (aka “nice to know”)?


Natural curiosity. Ok, it was stated in the SAP.

Evidently, if there were a need for these additional parameters, they would have been included in the output. So I can only conclude that they are not fashionable (“historically there has not been a lot of people asking for it”).

"Being in minority, even a minority of one, did not make you mad"
UA Flag
Activity
 Admin contact
23,336 posts in 4,902 threads, 1,669 registered users;
13 visitors (0 registered, 13 guests [including 6 identified bots]).
Forum time: 15:21 CET (Europe/Vienna)

Biostatistician. One who has neither the intellect for mathematics
nor the commitment for medicine but likes to dabble in both.    Stephen Senn

The Bioequivalence and Bioavailability Forum is hosted by
BEBAC Ing. Helmut Schütz
HTML5