Ahem – number of units? [Dissolution / BCS / IVIVC]
Hi Hötzi,
❝ Sure. The problem is the ‘regulatory crap’ I suspected in my post. Jiří showed nice simulations about the false positive rate. To squeeze the lower CL ≥50 is not another league but another sport. How many units will we have to test in the future? 24, 48, 96?
I'm thinking out loud here: you can bootstrap an answer in principle to such questions as well.
You generate a dissolution dataset with n chambers for T and R; then you resample with replacement to generate datasets with m chambers (m > n, presumably) per product, to find out at which level of m the lower bootstrap confidence limit of f2 exceeds 50.
The programming and simulation are actually quite simple. But yes, if you end up with 326 chambers per product, then I guess you are royally screwed.
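Just to show what I have in mind, here is a minimal sketch in Python. The mean profiles, the chamber-to-chamber variability, and the use of a lower 90% percentile bootstrap limit are all my own assumptions for illustration, not a validated procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def f2(R, T):
    """Similarity factor f2 from the mean dissolution profiles of R and T (% dissolved)."""
    d2 = np.mean((np.mean(R, axis=0) - np.mean(T, axis=0)) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + d2))

def boot_lower_cl(R, T, m, B=2000, alpha=0.10):
    """Lower percentile bootstrap confidence limit of f2 when resampling
    m chambers (with replacement) per product from the observed chambers."""
    stats = np.empty(B)
    for b in range(B):
        Rb = R[rng.integers(0, R.shape[0], m)]
        Tb = T[rng.integers(0, T.shape[0], m)]
        stats[b] = f2(Rb, Tb)
    return np.quantile(stats, alpha)

# --- invented example data: n = 12 chambers, 4 sampling time points ---------
n = 12
mu_R = np.array([35.0, 55.0, 75.0, 92.0])      # assumed mean % dissolved, R
mu_T = np.array([30.0, 50.0, 72.0, 90.0])      # assumed mean % dissolved, T
R = mu_R + rng.normal(0, 6, size=(n, 4))       # assumed chamber-to-chamber SD
T = mu_T + rng.normal(0, 6, size=(n, 4))

# find the smallest m at which the lower bootstrap limit of f2 exceeds 50
for m in range(n, 200, 6):
    cl = boot_lower_cl(R, T, m)
    print(f"m = {m:3d}  lower CL of f2 = {cl:5.1f}")
    if cl > 50:
        break
```

With a real dataset you would of course plug in the observed chambers instead of the simulated ones; the loop simply stops at the first m where the resampled lower limit clears 50.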
We could do a publication about this. We'd need a candidate dataset; I don't think I have one for this which I could put into the public domain.
Could this even be done as a two-stage approach? You first look at the bootstrap f2 levels in the initial n-chamber scenario. If you are passing, stop. Otherwise, find your sample size m, run the additional (m-n) chambers on top, then re-bootstrap the whole thing. This would be a "two-treatment, m-chamber, triple bootstrap, two-stage dissolution trial", and you may need to work with increased coverage intervals for alpha preservation.
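A sketch of the two-stage idea, reusing f2() and boot_lower_cl() from the block above; the widened coverage alpha_adj, the step size, and the maximum m are placeholders I made up, not an alpha-preserving design:

```python
import numpy as np

def two_stage(R, T, lower_cl, alpha_adj=0.05, m_max=96, step=6):
    """Two-stage bootstrap dissolution 'trial' (sketch only).

    Stage 1: bootstrap the initial n-chamber data; pass if the lower CL of
    f2 already exceeds 50.
    Stage 2: otherwise search for the smallest m <= m_max at which resampling
    m chambers per product clears 50, i.e. how many additional (m - n)
    chambers one would have to run before re-bootstrapping the pooled data.
    alpha_adj stands in for the increased coverage mentioned above.
    """
    n = R.shape[0]
    if lower_cl(R, T, n, alpha=alpha_adj) > 50:
        return ("pass at stage 1", n)
    for m in range(n + step, m_max + 1, step):
        if lower_cl(R, T, m, alpha=alpha_adj) > 50:
            return (f"run {m - n} extra chambers, then re-bootstrap", m)
    return ("fail / infeasible", m_max)

# e.g.  decision, m = two_stage(R, T, boot_lower_cl)
```

Whether simply widening the coverage really preserves alpha would itself have to be shown by simulation, of course.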
Man, I should win a Nobel prize for this idea.
—
Pass or fail!
ElMaestro