Imph
2022-07-12 19:05
Posting: # 23129

terminal rate constant estimation [NCA / SHAM]

Hello,

For the estimation of the terminal rate constant (lambda_z) using the ‘two times tmax’ (TTT) method, how do we choose the best number of points once the mono-exponential phase has been identified?

Best regards.
Helmut
Vienna, Austria
2022-07-12 19:39
@ Imph
Posting: # 23130

terminal rate constant estimation

Hi Imph,

❝ For the estimation of the terminal rate constant (lambda_z) using the ‘two times tmax’ (TTT) method, how do we choose the best number of points once the mono-exponential phase has been identified?


Not sure whether I understand your question… With TTT you get the starting point (the first sampling time ≥ 2 × tmax), then use all later concentrations until tlast; see the sketch below. However, the TTT method (like any other algorithm) works best for IR formulations but might fail on ‘flat’ profiles (controlled release formulations with flip-flop PK) or on profiles of multiphasic release formulations.
Hence, visual inspection of fits is mandatory.1–4
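
A minimal sketch of the TTT selection in R (hypothetical schedule and profile; a complete implementation, est.elim(), is given in my post below):

# TTT: start at the first sampling time >= 2 * tmax, then fit a
# log-linear regression through all later concentrations until tlast
t    <- c(0, 0.5, 1, 1.5, 2, 3, 4, 6, 8, 12, 16, 24) # hypothetical schedule
C    <- 100 * (exp(-0.2 * t) - exp(-1.5 * t))         # hypothetical profile
tmax <- t[which.max(C)]
sel  <- t >= 2 * tmax                                 # the TTT window
m    <- lm(log(C[sel]) ~ t[sel])                      # log-linear fit
lambda.z <- -coef(m)[[2]]                             # terminal rate constant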


  1. Schulz H-U, Steinijans VW. Striving for standards in bioequivalence assessment: a review. Int J Clin Pharm Ther Toxicol. 1991; 29(8): 293–8. PMID 1743802.
  2. Sauter R, Steinijans VW, Diletti E, Böhm E, Schulz H-U. Presentation of results from bioequivalence studies. Int J Clin Pharm Ther Toxicol. 1992; 30(7): 233–56. PMID 1506127.
  3. Hauschke D, Steinijans VW, Pigeot I. Bioequivalence Studies in Drug Development. Chichester: Wiley; 2007. p. 131.
  4. Scheerans C, Derendorf H, Kloft C. Proposal for a Standardised Identification of the Mono-Exponential Terminal Phase for Orally Administered Drugs. Biopharm Drug Dispos. 2008; 29(3): 145–57. doi:10.1002/bdd.596.

Imph
2022-07-19 14:13
@ Helmut
Posting: # 23150

terminal rate constant estimation

Hello,
That answers my question perfectly, thank you.

Another question, please: how can we know whether the sampling points chosen in the terminal phase of elimination are above the lower limit of bioanalytical quantification?

Thank you in advance.
Helmut
Vienna, Austria
2022-07-21 14:21
@ Imph
Posting: # 23155

optimizing schedule for late concentrations > LLOQ

Hi Imph,

❝ That answers my question perfectly, thank you.


Welcome.

❝ How can we know whether the sampling points chosen in the terminal phase of elimination are above the lower limit of bioanalytical quantification?


You need some information about the PK of the drug. Then, with a given sampling schedule and LLOQ, use one of the R functions at the end: micro() takes the micro constants, macro() the hybrid (macro) constants. If you have only limited information about the PK, use the second function and reproduce something reported in the literature by trial and error.
Example with the first function, micro():

D     <- 1000 # dose
f     <- 0.8  # fraction absorbed
V     <- 5    # volume of distribution
t.12a <- 0.5  # absorption half life
t.12e <- 4    # elimination half life
tlag  <- 0.25 # lag time
LLOQ  <- 5    # lower limit of quantification
t     <- c(seq(0, 3.5, 0.5),
           4, 5, 6, 8, 12, 16, 24) # sampling schedule
micro(D, f, V, k01 = log(2) / t.12a, k10 = log(2) / t.12e, tlag, LLOQ, t)

Gives:

      t         C <LLOQ
1   0.0   0.00000  TRUE
2   0.5  45.80507 FALSE
3   1.0  95.92186 FALSE
4   1.5 114.91995 FALSE
5   2.0 118.86166 FALSE
6   2.5 115.73643 FALSE
7   3.0 109.50068 FALSE
8   3.5 102.09751 FALSE
9   4.0  94.46631 FALSE
10  5.0  80.03327 FALSE
11  6.0  67.44892 FALSE
12  8.0  47.73428 FALSE
13 12.0  23.86910 FALSE
14 16.0  11.93456 FALSE
15 24.0        NA  TRUE


However, we’re not done yet! Never design a study based on an average half life. Be aware of two issues:
  1. Subjects with fast elimination will give BQLs at the end of the profile. Not desirable.
  2. Subjects with slow elimination will have a relatively high Clast, leading to trouble with the extrapolated AUC.
Therefore, explore a range of half lives:

t.12e <- seq(2, 6, 0.5) # fast to slow
res   <- data.frame(t.half = t.12e, tmax = NA_real_, Cmax = NA_real_,
                    BQLs = NA_integer_, tlast = NA_real_,
                    Clast = NA_real_, f = NA_real_, extr = NA_real_)
for (j in seq_along(t.12e)) {
  tmp          <- micro(D, f, V, k01 = log(2) / t.12a,
                        k10 = log(2) / t.12e[j], tlag, LLOQ, t)
  res$Cmax[j]  <- max(tmp$C, na.rm = TRUE)
  res$tmax[j]  <- tmp$t[which(tmp$C == res$Cmax[j])]
  res$BQLs[j]  <- sum(is.na(tmp$C))
  res$tlast[j] <- tail(t[!is.na(tmp$C)], 1)
  res$Clast[j] <- tmp$C[tmp$t == res$tlast[j]]
  res$f[j]     <- 100 * res$Clast[j] / res$Cmax[j]
  res$extr[j]  <- calc.AUC(tmp$C, t)$AUCextr.pct
}
names(res)[7:8] <- c("Clast/Cmax (%)", "extr. (%)")
print(signif(res, 4), row.names = FALSE)

 t.half tmax  Cmax BQLs tlast  Clast Clast/Cmax (%) extr. (%)
    2.0  1.5 100.6    3     8 14.540         14.450     9.400
    2.5  1.5 106.1    2    12  7.695          7.255     4.912
    3.0  2.0 111.2    1    16  5.045          4.538     3.162
    3.5  2.0 115.5    1    16  8.250          7.143     5.170
    4.0  2.0 118.9    1    16 11.930         10.040     7.480
    4.5  2.0 121.6    1    16 15.910         13.090     9.972
    5.0  2.0 123.8    0    24  6.607          5.338     4.136
    5.5  2.0 125.6    0    24  8.823          7.024     5.524
    6.0  2.0 127.2    0    24 11.230          8.830     7.032

Hence, with this LLOQ and sampling schedule all is good.
If you sample only till 16 hours:
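
Truncating the schedule and rerunning the loop from above (my assumption of how the following table was produced):

t <- c(seq(0, 3.5, 0.5),
       4, 5, 6, 8, 12, 16) # same schedule without the 24 h sample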

 t.half tmax  Cmax BQLs tlast  Clast Clast/Cmax (%) extr. (%)
    2.0  1.5 100.6    2     8 14.540         14.450     9.400
    2.5  1.5 106.1    1    12  7.695          7.255     4.912
    3.0  2.0 111.2    0    16  5.045          4.538     3.162
    3.5  2.0 115.5    0    16  8.250          7.143     5.170
    4.0  2.0 118.9    0    16 11.930         10.040     7.480
    4.5  2.0 121.6    0    16 15.910         13.090     9.972
    5.0  2.0 123.8    0    16 20.030         16.180    12.550
    5.5  2.0 125.6    0    16 24.180         19.250    15.160
    6.0  2.0 127.2    0    16 28.290         22.250    17.740

Works as well but is a close shave with long elimination (Clast/Cmax up to ≈22% and ≈18% extrapolated AUC).

Let’s explore an example from the literature* where only limited PK information is available: half lives 1.34 – 5.87 h (geometric mean 1.93 h), tmax 0.25 – 4 h (median 0.75 h), sampling schedule 0, 0.25, 0.5, 1, 2, 4, 6, and 8 h. The lowest reported concentration was 0.30 µg/mL; hence, I set the LLOQ to 0.25 µg/mL. By trial and error I arrived at coeff = 60 µg/mL. Since we don’t know the hybrid absorption rate constant \(\small{\alpha}\), we can approximate it from \(\small{t_{1/2}}\) and \(\small{t_\text{max}}\) by numerically solving $$\log_{e}(\alpha/\beta)/(\alpha-\beta)-t_\text{max}=0,\;\textrm{where}\;\beta=\log_{e}(2)/t_{1/2}.$$

t.12e <- c(1.34, 1.93, 5.87)
tmax  <- c(0.25, 0.75, 4)
t     <- c(0, 0.25, 0.5, 1, 2, 4, 6, 8)
LLOQ  <- 0.25
coeff <- 60
beta  <- log(2) / t.12e
res   <- data.frame(t.half = t.12e, tmax = NA_real_, Cmax = NA_real_,
                    BQLs = NA_integer_, tlast = NA_real_,
                    Clast = NA_real_, f = NA_real_, extr = NA_real_)
for (j in seq_along(t.12e)) {
  alpha        <- uniroot(opt, interval = c(0, 72), tol = 1e-8,
                          beta = beta[j], tmax = tmax[j])$root
  tmp          <- macro(coeff, alpha, beta = beta[j],
                        tlag = 0, LLOQ, t)
  res$Cmax[j]  <- max(tmp$C, na.rm = TRUE)
  res$tmax[j]  <- tmp$t[which(tmp$C == res$Cmax[j])]
  res$BQLs[j]  <- sum(is.na(tmp$C))
  res$tlast[j] <- tail(t[!is.na(tmp$C)], 1)
  res$Clast[j] <- tmp$C[tmp$t == res$tlast[j]]
  res$f[j]     <- 100 * res$Clast[j] / res$Cmax[j]
  res$extr[j]  <- calc.AUC(tmp$C, t)$AUCextr.pct
}
names(res)[7:8] <- c("Clast/Cmax (%)", "extr. (%)")
print(signif(res, 4), row.names = FALSE)

Error in est.elim(C, t) : Estimation of lambda.z not possible.
 t.half tmax  Cmax BQLs tlast   Clast Clast/Cmax (%) extr. (%)
   1.34 0.25 50.72    0     8  0.9571          1.887     1.714
   1.93 1.00 39.74    0     8  3.3910          8.533     6.423
   5.87 4.00 27.72    0     8 21.7600         78.510        NA

Agrees with what was reported. The one subject with tmax at 4 h also showed the longest half life. With sampling only until 8 h we can estimate λz neither by the TTT method (only one value remains at ≥ 2 × tmax) nor by the maximum R²adj fallback (only two values after tmax).
We need to sample at least until 20 h. With t <- c(0, 0.25, 0.5, 1, 2, 4, 6, 8, 12, 16, 20):

 t.half tmax  Cmax BQLs tlast  Clast Clast/Cmax (%) extr. (%)
   1.34 0.25 50.72    3     8 0.9571          1.887     1.714
   1.93 1.00 39.74    2    12 0.8062          2.029     1.526
   5.87 4.00 27.72    0    20 5.6490         20.380    13.500

Of course, you should sample more frequently in the early part of the profile.
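
For instance (a purely hypothetical schedule - adapt it to the expected tmax):

t <- c(0, 0.25, 0.5, 0.75, 1, 1.25, 1.5, 2, 3, 4, 6, 8, 12, 16, 20)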


  • Wagener HH, Vögtle-Junkert U. Intrasubject variability in bioequivalence studies illustrated by the example of ibuprofen. Int J Clin Pharmacol Ther. 1996; 34(1): 21–31. PMID 8688993.


micro <- function(D, f, V, k01, k10, tlag = 0, LLOQ, t) {
  # one-compartment model with optional lag time: micro constants
  # concentrations < LLOQ are set to NA
  # concentrations before tmax are set to zero
  # returns data frame with time, concentration and BQL-information

  if (!isTRUE(all.equal(k01, k10))) { # common: k01 != k10
    C <- f * D * k01 / (V * (k01 - k10)) *
         (exp(-k10 * (t - tlag)) - exp(-k01 * (t - tlag)))
  } else {                            # flip-flop PK
    k <- k10
    C <- f * D / V * k * (t - tlag) * exp(-k * (t - tlag))
  }
  C[C < LLOQ] <- NA             # flag BQLs
  tmax <- t[which.max(C)]       # which.max() ignores NAs
  C[t < tmax & is.na(C)] <- 0   # BQLs before tmax: set to zero
  res <- data.frame(t = t, C = C, BQL = FALSE)
  res$BQL[res$C == 0 | is.na(res$C)] <- TRUE
  names(res)[3] <- "<LLOQ"
  return(res)
}

macro <- function(coeff, alpha, beta, tlag = 0, LLOQ, t) {
  # one-compartment model with optional lag time: macro (hybrid) constants
  # concentrations < LLOQ are set to NA
  # concentrations before tmax are set to zero
  # returns data frame with time, concentration and BQL-information

  if (!isTRUE(all.equal(alpha, beta))) { # common: alpha != beta
    C <- coeff * (exp(-beta * (t - tlag)) - exp(-alpha * (t - tlag)))
  } else {                               # flip-flop PK
    stop("flip-flop PK not implemented.")
  }
  C[C < LLOQ] <- NA             # flag BQLs
  tmax <- t[which.max(C)]       # which.max() ignores NAs
  C[t < tmax & is.na(C)] <- 0   # BQLs before tmax: set to zero
  res <- data.frame(t = t, C = C, BQL = FALSE)
  res$BQL[res$C == 0 | is.na(res$C)] <- TRUE
  names(res)[3] <- "<LLOQ"
  return(res)
}

est.elim <- function(C, t) {
  # estimate lambda.z by the ‘two times tmax’ method
  # if less than three values, try R²adj.

  data     <- data.frame(t = t, C = C)
  data     <- data[complete.cases(data), ] # discard NAs
  Cmax     <- max(data$C)
  tmax     <- data$t[data$C == Cmax]
  data     <- data[data$t >= 2 * tmax, ]   # discard values < 2 * tmax
  if (nrow(data) < 3) { # TTT-method doesn’t work - fall back to R2adj
    data   <- data.frame(t = t, C = C)
    Cmax   <- max(data$C, na.rm = TRUE)
    tmax   <- data$t[which.max(data$C)]    # which.max() ignores NAs
    data   <- data[data$t > tmax, ]        # discard tmax and earlier
    data   <- data[complete.cases(data), ] # discard NAs
    if (nrow(data) < 3) stop("Estimation of lambda.z not possible.")
    lz.end <- tail(data$t, 1)
    # start with the last three concentrations
    x      <- tail(data, 3)
    r2     <- a <- b <- numeric()
    m      <- lm(log(C) ~ t, data = x)
    a[1]   <- coef(m)[[1]]
    b[1]   <- coef(m)[[2]]
    r2[1]  <- summary(m)$adj.r.squared
    # work backwards
    i      <- 1
    for (j in 4:nrow(data)) {
      i     <- i + 1
      x     <- tail(data, j)
      m     <- lm(log(C) ~ t, data = x)
      a[i]  <- coef(m)[[1]]
      b[i]  <- coef(m)[[2]]
      r2[i] <- summary(m)$adj.r.squared
      # don’t proceed if no improvement
      if (r2[i] < r2[i-1] || abs(r2[i] - r2[i-1]) < 0.0001) break
    }
    # location of the largest adjusted R2
    loc <- which.max(r2)
    if (b[loc] >= 0 || r2[loc] <= 0) {
      stop("Unreliable estimate - check your data.")
    } else {
      R2adj    <- r2[loc]
      intcpt   <- a[loc]
      lambda.z <- -b[loc]
      x        <- tail(data, loc + 2) # data of the best fit (fit i uses i + 2 points)
      lz.start <- x$t[1]
      lz.n     <- nrow(x)
      message("Less than 3 values for TTT - lambda.z estimated by R²adj.")
    }
  } else { # TTT-method
    lz.start <- head(data$t, 1)
    lz.end   <- tail(data$t, 1)
    lz.n     <- nrow(data)
    m        <- lm(log(C) ~ t, data = data)
    intcpt   <- coef(m)[[1]]
    lambda.z <- -coef(m)[[2]]
    R2adj    <- summary(m)$adj.r.squared
  }
  if (lambda.z <= 0) stop("Unreliable estimate - check your data.")
  res      <- data.frame(R2adj = R2adj, intcpt = intcpt, lambda.z = lambda.z,
                         lz.start = lz.start, lz.end = lz.end, lz.n = lz.n)
  return(res)
}

calc.AUC <- function(C, t) {
  # calculate AUC by the linear-up / log-down trapezoidal method
  # extrapolate based on the predicted Clast

  data <- data.frame(t = t, C = C, pAUC = 0, AUC = 0)
  data <- data[complete.cases(data), ] # discard NAs
  for (j in 1:(nrow(data) - 1)) {
    if (data$C[j+1] < data$C[j]) { # decreasing
      data$pAUC[j+1] <- (data$t[j+1] - data$t[j]) * (data$C[j+1] - data$C[j]) /
                        log(data$C[j+1] / data$C[j])
    } else {                       # increasing or equal
      data$pAUC[j+1] <- 0.5 * (data$t[j+1] - data$t[j]) *
                        (data$C[j+1] + data$C[j])
    }
  }
  data$AUC    <- cumsum(data$pAUC)
  extr        <- est.elim(C, t)[2:3]
  Clast.pred  <- exp(extr[[1]] - extr[[2]] * tail(data$t, 1))
  AUCinf.pred <- as.numeric(tail(data$AUC, 1) + Clast.pred / extr[[2]])
  AUCextr.pct <- 100 * (AUCinf.pred - tail(data$AUC, 1)) / AUCinf.pred
  return(list(AUC = data, Clast.pred = Clast.pred,
              AUCinf.pred = AUCinf.pred,
              AUCextr.pct = AUCextr.pct))
}

opt <- function(x, beta, tmax) {
  # objective: find alpha for given beta and tmax, i.e., the root of
  # log(alpha/beta) / (alpha - beta) - tmax
  # suppressWarnings(): log() of non-positive trial values gives NaN
  suppressWarnings(log(x / beta) / (x - beta) - tmax)
}
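
Putting the pieces together - a usage sketch reproducing the first example above (all parameter values as given there):

t   <- c(seq(0, 3.5, 0.5), 4, 5, 6, 8, 12, 16, 24)
tmp <- micro(D = 1000, f = 0.8, V = 5, k01 = log(2) / 0.5,
             k10 = log(2) / 4, tlag = 0.25, LLOQ = 5, t = t)
est.elim(tmp$C, tmp$t)       # lambda.z by TTT (fallback: max. R2adj)
calc.AUC(tmp$C, tmp$t)[2:4]  # predicted Clast, AUCinf, extrapolated AUC (%)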

