Aceto81 Regular Belgium, 2008-06-26 16:31 Posting: # 1973 Views: 15,386 

Hi all, I'm working at a small pharmaceutical company, and I'm taking my first (small) steps in the field of PK/PD. I received an article from a colleague which I should be able to reproduce (article: PK and PK/PD of doxycycline in drinking water after therapeutic use in pigs, J. vet. Pharmacol. Therap. 28, 525–530, 2005). Cmax, Cmin, AUC, and AUCss I'm able to reproduce, but I'm struggling with the time above the MIC90 and the %time above MIC90. For example: if I have a MIC90 for Pasteurella multocida of 0.517 µg/ml and the following data:
[time/subject/concentration table not preserved] Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS. I have no idea how to find the 102.5 hours. I searched in the forum's archive, in the manual of WinNonLin, and on the web, but couldn't find any references. Maybe someone can give me some help? I routinely use R, but now I'm also using WinNonLin. Thanks in advance Ace  Edit: Category changed; table reformatted using BBCodes. [Helmut] 
Helmut Hero Vienna, Austria, 2008-06-26 18:58 @ Aceto81 Posting: # 1975 Views: 14,122 

Dear Aceto81! » ...I'm struggling with the time above the MIC90 and %time above MIC90. » Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS. Looking at your data, this seems to be a reasonable value. » I have no idea how to find the 102.5 hours. » I searched in the forum's archive, in the manual of WinNonLin, on the web, but couldn't find any references. It's not implemented in WinNonlin; but I'm right now at a conference in London where one of the lecturers is Jason Chittenden of Pharsight – I will ask him. I don't know any reference either – except one by Jerome Skelly of the FDA, maybe 20 years ago. Start searching with the term "occupancy time" or just "occupancy". » Maybe someone can give me some help? » I routinely use R, but now I'm also using WinNonLin. Actually just start at the beginning of the profile and add up all segments which are above the MIC. If the values fall below the MIC and come up again (like in your case) do a linear interpolation. An algorithm is like this:
I just made an unchecked hotel-room quickshot in R; 116.5 hr >0.517 µg/ml within 0–124 hr?!! — Cheers, Helmut Schütz The quality of responses received is directly proportional to the quality of the question asked. ☼ Science Quotes 
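Helmut's R quickshot itself was not preserved in this thread, but the algorithm he describes (sum all segments above the MIC, with linear interpolation wherever the profile crosses it) can be sketched as follows. This is a reconstruction in Python, not the original code:

```python
def time_above(time, conc, threshold):
    """Total time a concentration profile spends above `threshold`,
    using linear interpolation at each crossing of the threshold."""
    total = 0.0
    for i in range(1, len(time)):
        t0, t1 = time[i - 1], time[i]
        c0, c1 = conc[i - 1], conc[i]
        if c0 >= threshold and c1 >= threshold:
            total += t1 - t0                 # whole segment above
        elif c0 < threshold <= c1:           # upward crossing
            tx = t0 + (t1 - t0) * (threshold - c0) / (c1 - c0)
            total += t1 - tx
        elif c0 >= threshold > c1:           # downward crossing
            tx = t0 + (t1 - t0) * (threshold - c0) / (c1 - c0)
            total += tx - t0
        # both below: contributes nothing
    return total
```

For a simple triangle through (0, 0), (1, 1), (2, 0) with the threshold at 0.5, the profile is above the limit from t=0.5 to t=1.5, i.e. `time_above([0, 1, 2], [0, 1, 0], 0.5)` returns 1.0.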
Aceto81 Regular Belgium, 2008-06-26 21:50 @ Helmut Posting: # 1976 Views: 13,896 

Dear Helmut, » Actually just start at the beginning of the profile and add up all segments which are above the MIC. If the values fall below the MIC and come up again (like in your case) do a linear interpolation. I would rather use R than WinNonlin for that purpose. Probably I was doing something wrong, maybe sitting too long in front of the PC, but when I tried this approach I wasn't even close... But anyway, with your reply I started over and finally it works! Thanks for your help (again) Ace PS: for anyone who may find it interesting: the R code for finding time below the threshold: "dat" is a data frame containing columns "conc" and "time"; th = threshold
f <- function(dat, th = 0.517) { Edit: Reformatted using BBCodes. [Helmut] 
Helmut Hero Vienna, Austria, 2008-06-27 00:18 @ Aceto81 Posting: # 1977 Views: 14,047 

Dear Ace, I'm too tired to fire up R and check your code, but here's a second quickshot in Excel – I didn't save my R code and wanted to get an independent solution. We have three segments in the profile above the threshold (I inserted conc=0 at t=0). Segment 1: 0.404 → 21.96 (delta 21.56) 
Helmut Hero Vienna, Austria, 2008-06-29 02:24 @ Aceto81 Posting: # 1984 Views: 13,899 

Dear Ace, » Prats et al. came up with a time of 102.5 hours and 94.9% of total time in SS. I overlooked this hint. The end of the last dosage interval therefore is 102.5/0.949 = 108 hours, which is strange. I would set it to the time of the last dose plus one dosage interval (tau). If tau=12 h, then the end should be set to 120 h. » But anyway, with your reply, I started over and finally it works! OK, checked it again, this time with the last time point at 108 hours (later samples dropped): time <- c(0.5, 1, 3, 5, 8, 12, 24, 36, 48, 60, 72, 84, 96, 108) Now I'm getting 103.6 hours (coverage 95.9%) in agreement with Excel and even a manual calculation. I have no idea how the reference's results (102.5 hours / 94.9%) were obtained (a last time point of 120 h yields 114.5 h / 95.4%). The R code right now needs a little cosmetics for t=0/c=0 (returns 'error in if (!is.na(dat$conc[i - 1])) { : argument has length 0') – although the result is still correct. BTW, I looked the reference up; I'm afraid it will not be very helpful: JP Skelly, Issues and controversies involving controlled-release drug product studies, Pharmacy International, Nov. 1986 (280–286) 
Aceto81 Regular Belgium, 2008-06-30 15:50 @ Helmut Posting: # 1988 Views: 13,954 

Hi HS, the time above MIC was calculated during steady state, which goes from 12 hours to 120 hours. That's why the results differ. (I was struggling with this part as well.) Here is the updated code:
time <- c(0.5, 1, 3, 5, 8, 12, 24, 36, 48, 60, 72, 84, 96, 108, 120, Thanks for your valuable input Ace 
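Ace's updated R function is only partially reproduced above, but the idea – count time above the MIC only within the steady-state window (12–120 h), interpolating at the window edges as well as at threshold crossings – can be sketched like this. A Python reconstruction under the assumption of linear interpolation throughout, not Ace's original code:

```python
def time_above_in_window(time, conc, thr, t_lo, t_hi):
    """Time the profile spends above `thr` within the window [t_lo, t_hi];
    linear interpolation at the window edges and at threshold crossings."""
    def lin(t0, t1, c0, c1, t):
        # concentration at time t on the segment (t0,c0)-(t1,c1)
        return c0 + (c1 - c0) * (t - t0) / (t1 - t0)

    total = 0.0
    for i in range(1, len(time)):
        t0, t1 = time[i - 1], time[i]
        c0, c1 = conc[i - 1], conc[i]
        a, b = max(t0, t_lo), min(t1, t_hi)   # overlap with the window
        if a >= b:
            continue
        ca, cb = lin(t0, t1, c0, c1, a), lin(t0, t1, c0, c1, b)
        if ca >= thr and cb >= thr:
            total += b - a                     # whole sub-segment above
        elif (ca < thr) != (cb < thr):         # one crossing inside [a, b]
            tx = a + (b - a) * (thr - ca) / (cb - ca)
            total += (b - tx) if cb >= thr else (tx - a)
    return total
```

For a triangle through (0, 0), (10, 10), (20, 0) with threshold 5, the full window [0, 20] gives 10 h above the limit (from t=5 to t=15), while a narrower window such as [7, 12] gives only 5 h.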
Helmut Hero Vienna, Austria, 2008-06-30 16:45 @ Aceto81 Posting: # 1989 Views: 14,116 

Dear Ace, OK, now everything’s clear. I like the way you set the limits – I would have used brute force and removed the data points from the data.frame… In case you need the reference – it’s only one paragraph and a figure: Occupancy time Where a therapeutic window exists, ‘Therapeutic Occupancy Time’^{1}, the time that the plasma concentration stays within the therapeutic range, becomes an important criterion. In steady state, the percentage of time the drug concentration lies within the therapeutic window is important. In the cited example, the drug lies within the therapeutic window for the subject population about 80% of the time (Fig. 4). The figure is a nice illustration of the interpolation/intersection. It should be noted that there’s also an upper limit, which complicates things: two formulations may have identical occupancy times from differently shaped curves! If not only the MIC but also toxicity is an issue, I would not suggest using the Occupancy Time without a thorough inspection of individual profiles. Another one, mixing up concentration with effect a little bit (Goodman & Gilman’s The Pharmacological Basis of Therapeutics, p. 19, McGraw-Hill, 11^{th} ed. 2006):
Helmut Hero Vienna, Austria, 2008-08-08 20:38 @ Helmut Posting: # 2164 Views: 14,842 

Dear Ace! » » I searched in […] the manual of WinNonLin, but couldn't find any references. » » It's not implemented in WinNonlin; but I'm right now at a conference in London where one of the lecturers is Jason Chittenden from Pharsight – I will ask him. OK, since Pharsight invited me to become a WinNonlin 6 beta tester, I'm just digging a little bit deeper into this stuff. I must correct myself – it's already there, but until today I didn't find it myself (in the online help: Noncompartmental Analysis > Therapeutic response). From your workbook, open NCA model 200. After performing the usual steps (dragging time and concentration to the respective fields, entering dose, …) go to Model > Therapeutic Response…; in the tab Therapeutic Response enter 0.517 into the field Lower. Click OK and run the calculation. You find the time above MIC (0.517) in the new workbook in the field TimeHgh. Now for the surprise: if the workbook contains only data from 12–120 h, WinNonlin comes up with 109.3966 (and TimeLow 10.6034), which adds up nicely to 120 h, but nobody asked for it. Next I added a new time column subtracting 12 h from the original ones (i.e., running from 0–108 h). Results: TimeHgh 101.5885 and TimeLow 6.4115… Again not the 102.5 h we would expect from the reference, your wonderful R code and my quickshot in Excel. Next I suspected some kind of interpolation issue, because I've set my default in WinNonlin to lin/log interpolation (Tools > Options > Models > Default Parameter Options > NCA calculation method > Linear Trapezoidal (Linear/Log Interpolation)). Changing to Linear Trapezoidal (Linear Interpolation) gave the same result. I don't know how the result is calculated… Next step:
[x/y table not preserved] A simple triangle – MIC set to 0.5; expected t>0.5 = 1 – Bingo! So maybe it's a problem with adding segments: [x/y table not preserved] Expected 2, reported 2… I checked some of my old datasets and always got differences to WinNonlin's results (never more than 5%, but also no agreement in a single case). Another example:
[t/C table not preserved] Don't be shocked about the profile – it's a formulation with two absorption phases. I wanted to calculate the Half Value Duration (the time interval where C > 50% C_{max}, aka HVD). Unfortunately it's not possible to enter a formula in the respective column – in column B there's the value of C_{max}, but entering =B1*0.5 into cell C1 gives 'Must enter numeric value. OK'. Therefore the calculation has to be done somewhere else… In the example C_{max}/2 = 2.268. Now let's do it the hard way: the first intersection is between 1–1.5 [1.144<2.268<2.399], the second one between 6–9 [3.387>2.268>1.643]. No fiddling around with linear regression, just a plain linear interpolation (paper, pencil, brain). The first intersection is 1.5+(1−1.5)×(2.268−2.399)/(1.144−2.399) = 1.4478, the second one 9+(6−9)×(2.268−1.643)/(3.387−1.643) = 7.9249. HVD = 7.9249−1.4478 = 6.4771. WinNonlin comes up with 6.2153. Now I got the idea that WinNonlin might interpolate logarithmically in the decreasing part: 9+(6−9)×(ln 2.268−ln 1.643)/(ln 3.387−ln 1.643) = 9+(6−9)×(0.8189−0.49652)/(1.21994−0.49652) = 7.6631. HVD = 7.6631−1.4478 = 6.2153; Q.E.D.! Actually this algorithm is reasonable (following the same logic as the lin-up/log-down option in AUC calculation) – but it's nowhere documented, and I couldn't find a way to change this setting. The green line shows the intersection in linear scale (conventional method). In logarithmic scale it's clear that WinNonlin's method (blue line) intersects the curve just at the right spot. IMHO the method is nice, if stated in the protocol – and if you don't have to struggle with recalculating old studies or comparing your data with the literature – which may drive you nuts. 
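Helmut's paper-and-pencil reconstruction of WinNonlin's behaviour (linear interpolation on rising segments, log-linear on falling ones) can be checked numerically. A small Python sketch, using only the two segments and the C_{max}/2 level quoted in the post; the function name and its `log_down` switch are illustrative, not anything from WinNonlin:

```python
import math

def cross_time(t0, t1, c0, c1, level, log_down=True):
    """Time at which the interpolated profile crosses `level` on the
    segment (t0,c0)-(t1,c1).  Rising segments use linear interpolation;
    falling segments use log-linear interpolation when log_down=True,
    mimicking the lin-up/log-down behaviour observed in this thread."""
    if log_down and c0 > c1 > 0:   # falling segment: interpolate on log scale
        return t0 + (t1 - t0) * (math.log(level) - math.log(c0)) / \
                    (math.log(c1) - math.log(c0))
    return t0 + (t1 - t0) * (level - c0) / (c1 - c0)

t_up = cross_time(1.0, 1.5, 1.144, 2.399, 2.268)   # rising: linear, ~1.4478
t_dn = cross_time(6.0, 9.0, 3.387, 1.643, 2.268)   # falling: log, ~7.6631
hvd = t_dn - t_up                                   # ~6.2153
```

With `log_down=False` the second crossing comes out at ~7.9249, reproducing the "conventional" HVD of 6.4771, while the log-down variant reproduces WinNonlin's 6.2153.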
Aceto81 Regular Belgium, 2008-08-12 09:56 @ Helmut Posting: # 2169 Views: 13,662 

Wow Helmut, you did a hell of a job. Great thinking! Do you think that the log/linear approach for the descending part is better than the classic linear approach? As you already stated, the elimination phase is log/linear, so maybe this is more correct than the 'classic' approach. Or maybe we can choose, as long as we clearly report which method we used? Best regards Ace PS: anyone interested in the full R code with an extension to choose log or not? 
Helmut Hero Vienna, Austria, 2008-08-12 12:32 @ Aceto81 Posting: # 2170 Views: 14,989 

Dear Ace! » you did a hell of a job. » Great thinking! To be honest, it was more trial-and-error than thinking. Reading the online help (and better: the User's Guide) was just a little helpful. Only once I found out what WinNonlin was doing could I understand the explanations – I’m getting old… » Do you think that the log/linear approach for the descending part is better than the classic linear approach? » As you already stated, the elimination phase is log/linear, so maybe this is more correct than the 'classic' approach. Yes, I would think so – especially if time points are rather ‘far’ apart. » Or maybe we can choose, as long as we clearly report which method we used? According to WinNonlin’s User’s Guide (page 186, PDF page 204) both options should be possible to choose from; but whatever I tried always ended up lin/log… Another nuisance: for MR formulations it would be nice to calculate the time interval where C≥50%C_{max} (Half Value Duration, HVD) or C≥75%C_{max} (Plateau Time, t_{75%}). In the ‘Therapeutic response windows’ tab WinNonlin shows the individual’s C_{max} in one column, but does not allow entering a formula in the next column. So it's time for copy-and-paste… For my two-segment absorption example it’s even worse: if I’m interested in the first and second part (like using WinNonlin’s partial area method) nothing helps but deleting the respective time points from the workbook. If you just exclude them, WinNonlin will still present the global C_{max} only. Yesterday I downloaded the WinNonlin 6 Phoenix Beta and will have a look how they ‘do’ it there. I didn’t dare to install the 156 MB yet (really lots of stuff: needs the M$ Visual C++ compiler and Office 2003, includes all successors of the recent version: WinNonlin, WinNonMix, IVIVC Toolkit, a new graphical engine [yes, two independent y-axes, Trellis plots!], a graphical model builder [similar to KINETICA], and, and…). » PS: anyone interested in the full R code with an extension to choose log or not? Yes, me. 
I gave it a short try with a start time = 0, and got the index error again. If you go with a log version you probably must add small dummy concentrations for values below the LLOQ. 
Aceto81 Regular Belgium, 2008-08-13 11:19 @ Helmut Posting: # 2176 Views: 13,635 

Dear Helmut, I changed the function radically, so here is version 2, with the data you provided (BLQ was changed to NA):
dat <- Best regards Ace 
Helmut Hero Vienna, Austria, 2008-08-13 13:18 @ Aceto81 Posting: # 2177 Views: 13,785 

Dear Ace! Thanks for your efforts! I’m a little bit short in time, so expect testing from my side later. I think a critical point is the intersections between time points where one is >LLOQ and the other is <LLOQ. WinNonlin has some kind of homebrew for it… 
Aceto81 Regular Belgium, 2008-08-13 15:35 @ Helmut Posting: # 2181 Views: 13,636 

Dear HS » Thanks for your efforts! I’m a little bit short in time, so expect testing from my side later. No problem, I'm short in time too, but if you find some bugs, let me know and I will fix them as soon as possible. » I think a critical point is the intersections between time points where one is >LLOQ and the other is <LLOQ. WinNonlin has some kind of homebrew for it… I'm not sure how I should interpret this; if you can give an example, that would be great. But if you let me know how to treat these things, I'm sure I can put it into the function. Ace 
SDavis Senior UK, 2009-06-04 10:55 (edited by sdavis on 2009-06-04 11:31) @ Helmut Posting: # 3816 Views: 13,926 

Sorry if it's off topic on this quite old discussion, but I would like to confirm that Phoenix 1.0 (Phoenix WinNonlin 6.0 and Phoenix Connect 1.0) is released and available for download through the Pharsight Support site. This is an upgrade covered by your annual maintenance fee, so there is no additional cost to you, and it can run on the same machine as WinNonlin 5.x without interference. (The validation kit will be released in a few weeks for those of you who are looking to upgrade formally.) » Yesterday I downloaded the WinNonlin 6 'Phoenix' Beta and will have a look how they 'do' it there. I didn't dare to install the 156 MB yet (really lots of stuff: needs the M$ Visual C++ compiler and Office 2003, includes all successors of the recent version: WinNonlin, WinNonMix, IVIVC Toolkit, a new graphical engine [yes, two independent y-axes, Trellis plots!], a graphical model builder [similar to KINETICA], and, and…). As Helmut mentions above there are many exciting new features that I encourage you to look at; and please comment on what you like and even don't like so Pharsight can continue to improve this analysis tool. Simon. PS and yes, the calculation of "Therapeutic Response" is still there in the NCA setup tab to produce the following additional durations and AUCs above, within and below those limits; TimeLow — Simon Senior Scientific Consultant Pharsight – A Certara™ Company Simon Davis at LinkedIn Forthcoming meetings and training Problems are not stop signs, they are guidelines. Robert H. Schuller 
Helmut Hero Vienna, Austria, 2009-06-04 11:57 @ SDavis Posting: # 3818 Views: 13,274 

Hi Simon, thanks for the news. I would suggest that Pharsight's customers are notified by email as well. » PS and yes the calculation of "Therapeutic Response" is still there... Will be interesting to see how P1 performs. 
SDavis Senior UK, 2009-06-04 18:50 (edited by sdavis on 2009-06-05 13:45) @ Helmut Posting: # 3824 Views: 13,781 

Thanks Helmut. All our customers should have received the notification directly by email now; in particular letting them know about the forthcoming launch webinars. To provide you with an opportunity to learn more about these new solutions, Pharsight is hosting three upcoming webinars, summarized below. We invite you to register and visit http://pharsight.com/events/eventsonline_schedule.php. Introducing Phoenix™ WinNonlin^{®} 6.0 Presenter: Daniel Weiner, Ph.D., Sr. Vice President & Chief Technology Officer June 11, 2009, 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET) This webinar provides an overview and demonstration of Phoenix WinNonlin 6.0, the next generation of the industry-standard software tool for PK/PD modeling and noncompartmental analysis. The demonstration will focus on the major features of Phoenix WinNonlin 6.0, including: new workflow functionality for easy visual creation and reuse of PK/PD analyses, data visualization tools and high-quality graphics, enhanced modeling capabilities, and a new underlying architecture to facilitate integration with third-party modeling and analysis tools. A Detailed Comparison of Phoenix™ WinNonlin^{®} 6.0 and WinNonlin 5.2.1 Presenter: Ana Henry, M.S., Director of Product Management June 16, 2009, 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET) This webinar will provide a detailed comparison of how Phoenix WinNonlin 6.0 is similar to and different from WinNonlin 5.2.1 within the context of a standard noncompartmental pharmacokinetic analysis. The comparison will focus on creating graphs, running an NCA analysis, and generating tables. The presentation will also demonstrate how to refresh out-of-sync workflows and how to reuse previous workflows with new datasets via the new Phoenix WinNonlin 6.0 template feature. 
Introducing Phoenix Connect™ 1.0 Presenter: Jason Chittenden, M.S., Director of Product Quality July 1, 2009, 11 am Eastern (8 am PDT, 4 pm GMT, 5 pm CET) This webinar will provide an overview and demonstration of how Phoenix Connect enables interoperability between SAS, S-PLUS, NONMEM, and CDISC data sources while providing scientists the rich features of the Phoenix platform. The demonstration will feature several specific examples of how Phoenix Connect can be used to enhance the efficiency of data management, analysis, and modeling tasks. (Remember you heard it here first, Simon ;0) EDIT: some people said they couldn't find the list on the website, so here's the link to the relevant page. There are some useful FAQ docs on the product page too. 
Helmut Hero Vienna, Austria, 2009-06-04 23:40 @ SDavis Posting: # 3826 Views: 13,253 

Hi Simon, thanks for the information! But I have to issue a warning; to quote the Forum’s Policy: Advertisings […] or posts that are commercial in nature are prohibited and will be removed without further notice. 
Astea Regular Russia, 2017-10-28 22:15 @ Helmut Posting: # 17938 Views: 3,988 

Dear smart people! Can anyone please suggest an easy way of calculating T_{50%} early and late, T_{75%}, T_{90%} and similar parameters via Phoenix? Up to now I could only find this old thread dedicated to the Therapeutic Response module. But following this module we can only get TimeLow, TimeHigh or TimeDuring in the output of pharmacokinetic parameters. Imagine we have an ideal standard curve; then TimeLow = T_{50%}^{early} + T − T_{50%}^{late}, TimeDuring = T_{50%}^{late} − T_{50%}^{early} (here T is the total duration of observation). So it is impossible to get T_{50%}^{early} and T_{50%}^{late} separately from these data. To deal with it I had to delete data after T_{max} for calculating T_{50%}^{early} or before T_{max} for calculating T_{50%}^{late}. By the way, in this old thread as well as in this one it was written that Phoenix approximates the decreasing part with log transformation independently of the rule chosen for the AUC. Now this seems to have changed, and the actual form of approximation depends on the rule for the AUC, so it can be linear as well as log-linear. But this is not true for partial area calculation: for example, for AUC_{0–72} for subjects with a missing t=72 value, Phoenix always does log interpolation even if we've chosen linear AUC calculation. Returning to the TimeLow calculation: I am also puzzled why for the following dataset we get TimeLow=18 if the Lower limit is chosen to be 3. Because if we subtract 4 from all time points we'll get 16... (linear interpolation was selected)
t  C
4  6
6  4
8  3
10  2
12  0
24  0
Helmut Hero Vienna, Austria, 2017-10-28 23:20 @ Astea Posting: # 17939 Views: 3,962 

Hi Astea, » Can anyone please suggest an easy way of calculating T_{50%} early and late, T_{75%}, T_{90%} and similar parameters via Phoenix? Maybe later. » By the way, in this old thread as well as in this one it was written that Phoenix approximates the decreasing part with log transformation independently of the rule chosen for the AUC. Although not described in the User’s Guide, correct. » Now this seems to have changed, and the actual form of approximation depends on the rule for the AUC, so it can be linear as well as log-linear. Correct since the current release v8.0 (formulas (6) and (7) on p. 53 of the User’s Guide). » But this is not true for partial area calculation: for example, for AUC_{0–72} for subjects with a missing t=72 value, Phoenix always does log interpolation even if we've chosen linear AUC calculation. Like in previous versions (though I didn’t check it). » I am also puzzled why for the following dataset we get TimeLow=18 if the Lower limit is chosen to be 3. Because if we subtract 4 from all time points we'll get 16... (linear interpolation was selected) » t  C: 4/6, 6/4, 8/3, 10/2, 12/0, 24/0. Since your data set doesn’t contain a concentration at t=0, it is extrapolated (see the core output). The method depends on the Dose Options: for Extravascular we get C=0 and for IV Bolus C=13.5. Then I got these results for TimeLow in the releases of Phoenix/WinNonlin I have on my machine:
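Helmut's explanation can be verified in a few lines: with the extrapolated point C=0 at t=0 prepended (the extravascular setting), linear interpolation gives exactly the 18 h reported. A Python sketch, assuming TimeLow is simply the time the profile spends below the limit:

```python
def time_below(time, conc, thr):
    """Time the profile spends below `thr`, with linear interpolation
    at each crossing of the limit."""
    total = 0.0
    for i in range(1, len(time)):
        t0, t1 = time[i - 1], time[i]
        c0, c1 = conc[i - 1], conc[i]
        if c0 < thr and c1 < thr:
            total += t1 - t0                 # whole segment below
        elif (c0 < thr) != (c1 < thr):       # one crossing in the segment
            tx = t0 + (t1 - t0) * (thr - c0) / (c1 - c0)
            total += (tx - t0) if c0 < thr else (t1 - tx)
    return total

t = [0, 4, 6, 8, 10, 12, 24]   # extrapolated C(0)=0 prepended (extravascular)
c = [0, 6, 4, 3, 2, 0, 0]
low = time_below(t, c, 3)      # 2 h (0 to 2) + 16 h (8 to 24) = 18 h
```

Dropping the extrapolated t=0 point (i.e., starting the profile at t=4) removes the rising crossing at t=2 and leaves only the 16 h that puzzled Astea.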
Astea Regular Russia, 2017-10-28 23:49 @ Helmut Posting: # 17940 Views: 3,932 

Dear Helmut! Thank you for your rapid answer! » Since your data set doesn’t contain a concentration at t=0, it is extrapolated (see the core output). Oh, now it's clear. I didn't expect this clever machine to extrapolate it to zero for Extravascular. So the 2 extra hours are exactly the time from 0 to 4 with concentration less than 3. 
Helmut Hero Vienna, Austria, 2017-10-30 13:18 @ Astea Posting: # 17951 Views: 3,857 

Hi Astea, » […] an easy way of calculating T_{50%} early and late, T_{75%}, T_{90%} and similar parameters via Phoenix? I don’t know any method to obtain the intersections in PHX/WNL. As discussed before, starting with release 8.0 the intersection in the decreasing part(s) depends on the selected trapezoidal method (it was always lin/log in previous releases). Simple example: C = 20·(ℯ^{–ln(2)/4·t} – ℯ^{–ln(2)·t}) [simulated t/C table not preserved] Calculation of the Occupancy Time (the interval where concentrations are above a fixed value) is trivial. More demanding are the Half Value Duration (HVD, t_{50%}), the Plateau Time (t_{75%}), and their relatives, because they depend on the subject’s C_{max}. Hence, we need two steps (I used the data above for subject 1 and ½C for subject 2); example for HVD:
TimeBetween is the HVD. IMHO, TimeLow is only interesting for a fixed limit (Occupancy Time) and if t is the intended τ. For the linear trapezoidal you should get
I posted an example project at the Certara Forum. 
Astea Regular Russia, 2017-10-31 01:12 @ Helmut Posting: # 17957 Views: 3,791 

Dear Helmut! Thank you for the detailed explanation! Now everything has become clear except for the intersections themselves. If the calculation of TimeLow and TimeDur is based on the difference between intersections, why are they not included in the output or elsewhere? Of course, the structure of the curve could be more complicated and there could be several intersections... So, how to get the quoted values 0.595 to 8.386 or 0.595 to 8.257 not by hand, brain, R, Excel, but by Phoenix? For a single-peak chart I came up with insane ideas of initial data manipulation, namely truncating or reflecting the curve. For example, with the mentioned dataset: 1) cutting the data from 0 to C_{max} we'll get TimeLow = T_{50%}^{early} = 0.595056. 2) cutting the data from C_{max} to T_{last} and shifting the graph to zero we'll get T_{50%}^{late} equal to TimeHigh + T_{max} = 8.386133 or 8.256799, depending on the selected trapezoidal method. 3) reflecting the decreasing part of the curve (that is, for t > T_{max} replacing C_{t} by 2·C_{max} − C_{t}) and choosing Lower = 0.5·C_{max}, Upper = 1.5·C_{max}, we'll get together linear TimeLow = T_{50%}^{early} = 0.595056 and TimeLow + TimeDur = T_{50%}^{late} = 8.386133. Looks mad; can it be made easier? 
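Outside Phoenix the truncation/reflection workarounds are unnecessary: one scan over the profile collects every crossing of the C_{max}/2 level, and the first and last crossings are T_{50%}-early and T_{50%}-late directly. A Python sketch with linear interpolation on all segments (swap in log interpolation on falling segments if the lin-up/log-down convention is wanted); the function names are illustrative:

```python
def crossings(time, conc, level):
    """All times at which the linearly interpolated profile crosses `level`
    (strict sign changes only)."""
    out = []
    for i in range(1, len(time)):
        t0, t1 = time[i - 1], time[i]
        c0, c1 = conc[i - 1], conc[i]
        if (c0 - level) * (c1 - level) < 0:
            out.append(t0 + (t1 - t0) * (level - c0) / (c1 - c0))
    return out

def t50_early_late(time, conc):
    """T50%-early and T50%-late: first and last crossing of Cmax/2."""
    half = max(conc) / 2
    xs = crossings(time, conc, half)
    return xs[0], xs[-1]
```

For a single-peak triangle through (0, 0), (4, 10), (10, 0), `t50_early_late` returns (2.0, 7.0); for a double-peak profile the full `crossings` list shows every intersection, which is exactly what the truncation tricks cannot deliver.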
Helmut Hero Vienna, Austria, 2017-10-31 12:32 @ Astea Posting: # 17958 Views: 3,791 

Hi Astea, » Now everything has become clear except for the intersections themselves. If the calculation of TimeLow and TimeDur is based on the difference between intersections, why are they not included in the output or elsewhere? Ask Certara. If there is no demand by customers it will not be implemented. Even if there is one, it may take ages. Example: the intercept of the log/linear regression used in the estimation of λ_{z} was only part of the Core output (if Intermediate Output was selected) up to release 7.0… Since the EMA stated “by C_{min,ss} we mean the concentration at the end of the dosage interval, i.e. C_{trough}” in 2010, it took seven years to get both Cmin (as in previous versions) and Ctau (observed if t=τ or predicted if t≠τ) in the output, plus the option to compute concentrations at arbitrary times (useful for C_{τ} in SD). Before that we needed a workaround. » Of course, the structure of the curve could be more complicated and there could be several intersections... Exactly. See here, there, and my example above. » So, how to get the quoted values 0.595 to 8.386 or 0.595 to 8.257 not by hand, brain, R, Excel, but by Phoenix? IMHO, in Phoenix/WinNonlin impossible. In Phoenix/NLME doable, since PML (Phoenix Modeling Language) offers the required features to loop through the data. May I ask: why do you need the intersections? Curiosity (aka “nice to know”)? » For a single-peak chart I came up with insane ideas of initial data manipulation, namely truncating or reflecting the curve. Nice try! As you rightly observed: it does not work for multiple intersections and/or >1 subject. 
Astea Regular Russia, 2017-10-31 18:24 @ Helmut Posting: # 17959 Views: 3,723 

Dear Helmut! » May I ask: why do you need the intersections? Curiosity (aka “nice to know”)? Evidently, if there were no need for the additional parameters, they wouldn't be included in the output. So I can just conclude that these parameters are not fashionable ("historically there has not been a lot of people asking for it"). 