LSB - evil, terrible and annoying [Bioanalytics]
Hi,
❝ Let’s have a look how peak integration in chromatography is performed.
...and how data are captured electronically:
Data acquisition is typically done over a voltage range (e.g. 0–5 mV); the data acquisition hardware has a resolution, measured in bits.
The number of bits determines the minimum change in voltage that can be detected. For example, an 8-bit data acquisition system splits the capture range into 2⁸ = 256 units, each having the size of the smallest theoretically detectable change in voltage; if the capture range is 0–5 mV, then the smallest detectable change in voltage is 5 mV/256 ≈ 0.0195 mV (corresponding in theory¹ to the least significant bit, LSB).
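The arithmetic above is easy to play with yourself. A minimal sketch (the function name and the 8-bit / 0–5 mV numbers are just the example from the text, not any vendor's API):

```python
# Size of one LSB for a given ADC resolution and capture range.
def lsb_size(v_min_mV, v_max_mV, bits):
    """Voltage (in mV) represented by one least significant bit."""
    return (v_max_mV - v_min_mV) / (2 ** bits)

print(lsb_size(0.0, 5.0, 8))   # 8-bit over 0-5 mV: ~0.0195 mV, as above
print(lsb_size(0.0, 5.0, 16))  # 16-bit over the same range: far smaller
```

Note how each extra bit halves the LSB: going from 8 to 16 bits makes the smallest detectable step 256 times finer.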
Practical proof of the presence of an LSB:
Take any captured chromatogram and enlarge a portion of it in your software; a piece of baseline is fine for this purpose. Superficially it looks like a flat line with noise. Zoom in further and sooner or later a distinct saw-tooth-like pattern becomes evident: the captured values jump between discrete levels, and these levels represent the LSB.
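The same effect can be simulated. A sketch of idealized quantization, assuming an 8-bit converter over 0–5 mV and a made-up noisy baseline (all numbers hypothetical):

```python
# Why a zoomed-in baseline shows discrete steps: quantize a noisy
# "analog" signal the way an idealized 8-bit ADC would. Every stored
# sample lands on an exact multiple of the LSB.
import random

LSB = 5.0 / 256  # mV per count: 8-bit ADC over a 0-5 mV range

def quantize(v_mV):
    """Round an analog voltage to the nearest ADC level."""
    return round(v_mV / LSB) * LSB

random.seed(1)
baseline = [0.05 + random.gauss(0, 0.01) for _ in range(10)]  # analog noise
digitized = [quantize(v) for v in baseline]
codes = sorted(set(round(v / LSB) for v in digitized))
print(codes)  # only a handful of discrete codes: the saw-tooth levels
```

However fine the analog noise is, the digitized trace can only visit a few integer codes, which is exactly the staircase you see when you zoom in.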
The consequence of it all is that for small peaks, the LSB can in some cases be of "significant" magnitude. Conversely, as peaks get larger, the digital signal can for all practical purposes be considered continuous. Most software peak-detection Al Gore Rhythms are based on that assumption, mainly because it is terribly difficult to write anything meaningful for a signal that makes quantum leaps; and even if such software were functional, no one would be able to tell how small a peak should be before algo1 is used instead of algo2 in a given run. And people would prolly start wars when discussing the same issue between runs. And some intelligence agency would prolly want that algo and prevent it from being published if it became a reality.
As computers and data acquisition boards get better (more bits!), the significance (no pun intended) of the LSB is, in practice, slowly disappearing, fortunately. But in 2010 A.D. it is still showing its ugly face with a few drugs out there.
I agree very much with the essence of HS' post: Common sense and human eyeballing are better by a large factor than any detection algo.
Best regards
EM
¹: Real life is cruel. The LSB is never as "good" as in this theory, due to factors like electronic noise, voltage drift and more. A 16-bit system is often only a 14–15-bit system in practice.
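The "16 bits on paper, 14–15 in practice" point can be made concrete with the usual rule of thumb for effective number of bits, ENOB = (SINAD − 1.76)/6.02, where SINAD is the measured signal-to-noise-and-distortion ratio in dB. A sketch (the SINAD values below are invented for illustration, not measurements of any particular board):

```python
# Effective number of bits (ENOB) from a measured SINAD, using the
# standard rule of thumb ENOB = (SINAD_dB - 1.76) / 6.02.
def enob(sinad_db):
    """Effective resolution in bits for a given SINAD in dB."""
    return (sinad_db - 1.76) / 6.02

print(round(enob(98.0), 1))  # near-ideal 16-bit converter: ~16 bits
print(round(enob(86.0), 1))  # same board with real-world noise: ~14 bits
```

So a nominally 16-bit board whose analog front end loses ~12 dB of SINAD to noise and drift really does behave like a 14-bit one, exactly as the footnote says.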
—
Pass or fail!
ElMaestro