The "gold standard" in substance identification

Gas Chromatography/Mass Spectrometry (GC/MS) has been widely heralded as a "gold standard" for forensic substance identification. The reputation of this technique is not the product of new developments in cutting-edge science, but instead reflects its origin in the successful combination of two considerably older and well-tested processes. The two processes have now been used in conjunction for nearly 40 years. But over these last four decades—especially during the last decade—GC/MS technology has increasingly transformed in terms of application and equipment specifications. This article reflects upon the traditional forensic uses of GC/MS in light of new GC/MS manifestations and applications. One wonders, for example, whether the gold standard is being "ratcheted up" or whether these new technologies in effect prove too much. To that end, this article also addresses certain radically enhanced configurations such as tandem GC/MS and high-speed GC/MS.
Tandem GC/MS is contextualized in terms of novel applications which identify trace compounds in materials previously thought to be disintegrated beyond identification, while high-speed GC/MS is contextualized by new instruments that detect certain signature molecules almost instantly, identifying certain substances in luggage or even on human beings. However, in order to appreciate the radical departures, this paper first focuses on GC/MS in its most conventional "first generation" form.
First generation GC/MS and scientific evidence application
To appreciate the implications of GC/MS advances for scientific evidence, it is helpful first to discuss GC/MS generally in its long-standing role. GC/MS has been, and continues to be, a specific test for the identification of substances. In the courtroom, it has been principally applied to the areas of drug testing, fire investigation, environmental analysis, and explosives investigation. In these areas GC/MS is considered to be reliable, universally accepted, and the "gold standard."
One of the reasons GC/MS is well-regarded is that it is a specific test—as opposed to a non-specific test. It can be used to positively identify the actual presence of a particular substance in a given sample. A non-specific test, by contrast, merely indicates that a substance falls within a category of substances. Many other particular and sometimes unrelated substances may also fall within such a category. Typically the non-specific test elicits a reaction popularly or statistically associated with the presence of a certain substance. But such a "positive" reaction may in fact be a false positive because it can be triggered by other substances. Unlike GC alone, GC/MS is itself a specific test; yet unlike many other specific tests, it can be applied to a plethora of substances, as discussed below. Thus when GC/MS succeeds in making an identification, it positively identifies substances.
The GC/MS process
Although many aspects of GC/MS have changed, the essential functions of the components involved have not. The most fundamental component distinction is between gas chromatography (GC) and mass spectrometry (MS). In GC/MS, the primary purpose of gas chromatography is to separate the sample it analyzes: it renders a mixture consisting of several chemical compounds into its independent components. Bear in mind that GC is a test unto itself, albeit a non-specific one: GC can determine the proportions of the components of the mixture, but does not specifically identify the components. In GC analysis, one is left to infer the identity of a component through its retention time, although these values are neither unique nor truly precise. However, GC is unmatched in its ability to separate compounds, and this separation function is essentially the exclusive function of the GC segment within a GC/MS unit. The mass spectrometry unit receives each molecule from the GC unit and measures its mass. Further, it breaks the molecule into fragments, and measures the charge and mass of each fragment. This process culminates in the identification of the molecule.
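To illustrate why retention time alone is non-specific, here is a minimal Python sketch using entirely hypothetical retention-time values: two different compounds fall within the same retention window, so GC alone cannot distinguish them.

```python
# Illustrative only: hypothetical retention times (in minutes) for a toy library.
RETENTION_LIBRARY = {
    "compound A": 4.21,
    "compound B": 4.23,   # nearly identical to compound A
    "compound C": 7.80,
}

def candidates_by_retention(observed_rt, tolerance=0.05):
    """Return every library compound whose retention time falls within the tolerance window."""
    return [name for name, rt in RETENTION_LIBRARY.items()
            if abs(rt - observed_rt) <= tolerance]

# A peak observed at 4.22 minutes matches two different compounds.
print(candidates_by_retention(4.22))   # ['compound A', 'compound B']
```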
GC components
In spite of advances and improvements, gas chromatography equipment has consisted of six essential pieces for forty years: a carrier gas supply, a separation column, the inlet/injection port, an oven, a detector, and a data system. There are two very important consumables added to this setup, namely the stationary phase and the mobile phase.
The purpose of the stationary phase is to effectuate the correct separation. The stationary phase is contained within the separation column. It is so-called because the stationary phase material remains immobilized during the GC process. The column contents are contained by tubing, sometimes made of glass or metal, but the dominant trend is towards fused silica because, unlike glass, it is physically flexible. Also, unlike metal tubing, it virtually never reacts with or catalyzes changes in the sample.
There are three primary kinds of separation columns, each of which describes how the stationary phase is applied: in liquid form, solid form, and within a capillary tube. The nature of the stationary phase explains the formal designation for each process, namely, Gas-Liquid Chromatography (GLC), Gas-Solid Chromatography (GSC), and Capillary Gas Chromatography (Capillary GC). That designation refers to the entire GC process involving that particular column. The difference between each process is essentially how and where the stationary phase is applied. GLC, the most common procedure, requires that an inert "support" substance be coated with the liquid stationary phase before being packed into the tubing. In GSC, the tube is also packed, but it is packed with stationary phase granules, which are solid in form. In Capillary GC, a comparatively narrow tube is itself coated with the stationary phase. The substance is immobilized not by being packed into the tube, but by bonding to the inside of the tubing itself. The stationary phase adheres to the tubing either by the tubing's physical properties or by an adhesive-like polyimide coating. The ideal stationary phase tends to have a heavy atomic weight and causes the components of the sample to become distributed variously between the mobile and stationary phases.
The mobile phase consists of the carrier gas, which must be inert to both the column materials and the sample to be analyzed. The gases of choice are generally hydrogen and helium because they are typically inert in most situations. They are also ideal because they move efficiently due to a low ratio of diffusion to viscosity.
Beyond the specific properties of the column types described above, the operating principles are fundamentally the same among the various GC processes. The carrier gas cycles continuously through the GC system. First the sample is introduced into the mobile phase through the injection port. This was once almost exclusively done through a hypodermic-style syringe. Recently an automated form of injection has been gaining currency in state-of-the-art labs to reduce error caused by improper operator technique. Moreover, such injectors permit the regulated injection of pre-heated samples, and some even permit the regulated injection of gases. Next the components of the sample begin to separate. The components of the sample with more affinity to the stationary phase move at a slower rate relative to the flow of the mobile phase. Because there are different rates of movement, sample components separate into identifiable and measurable groups. Since liquid or solid samples must be kept in a gaseous state after entering the GC process, the column is heated by ovens that maintain a suitable temperature. Sometimes the heating unit will be used to "ramp," or gradually and systematically increase, the temperature, which can cause the components with lower boiling points to exit the column sooner. Thus, in addition to the stationary phase, another resisting force causing the separation is heat. In due course, all the components exit the column and enter the MS unit.
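As a rough illustration only (a toy model with invented boiling points and affinity values, not instrument physics), the sketch below shows how a temperature ramp tends to push lower-boiling components with less affinity for the stationary phase out of the column earlier.

```python
# Toy model with made-up numbers: elution order under a linear temperature ramp.
components = [
    {"name": "component X", "boiling_point_c": 80,  "affinity": 1.0},
    {"name": "component Y", "boiling_point_c": 120, "affinity": 1.8},
    {"name": "component Z", "boiling_point_c": 150, "affinity": 2.5},
]

START_TEMP_C = 50
RAMP_C_PER_MIN = 10  # the oven "ramps" the temperature gradually and systematically

def toy_elution_minutes(component):
    """Minutes until the oven reaches the component's boiling point, delayed by its affinity."""
    minutes_to_boil = max(component["boiling_point_c"] - START_TEMP_C, 0) / RAMP_C_PER_MIN
    return minutes_to_boil * component["affinity"]

for c in sorted(components, key=toy_elution_minutes):
    print(f'{c["name"]} elutes at roughly {toy_elution_minutes(c):.1f} minutes')
```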
MS components
The MS unit consists of four primary features: an ionizer, the magnetic region, a detector, and a recorder. When the stream exiting the GC unit enters the MS unit, it first goes into the ionizer. The ionizer inundates the sample material with an electron beam, causing it to become positively charged, if it was not already. This results in the fragmentation of components into smaller pieces. Because every type of component molecule fragments according to properties inherent to that component, each has a unique fragmentation pattern. The magnetic region may take two forms, a quadrupole magnetic focusing assembly or a single-magnet analyzer tube. The quadrupole setup uses four magnets to focus specific molecules with a certain atomic mass (i.e., weight) through a small slot. Molecules with a different atomic mass merely bounce around the magnetic region until their turn comes to be focused by the four magnets through the slot. Alternatively, the single-magnet analyzer chamber deflects the various particles through an electromagnetic field within a long curved tube. The lighter particles traverse the analyzer tube the fastest. The particles emerge from the magnetic region and strike the detector, transferring their charge. This activates the recorder, which takes note of the atomic mass through the mass/charge ratio and evaluates the concentration of that molecule contained in the sample.
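A minimal conceptual sketch of the quadrupole's role as an m/z filter, using invented ion values: only ions whose mass-to-charge ratio falls within the selected window pass through the slot on a given pass; the rest wait their turn.

```python
# Hypothetical ions present in the magnetic region, each represented by its m/z value.
ions_in_region = [39.0, 65.1, 91.0, 91.1, 134.2]

def quadrupole_pass(ions, selected_mz, window=0.5):
    """Return the ions close enough to the selected m/z to be focused through the slot."""
    return [mz for mz in ions if abs(mz - selected_mz) <= window]

# Scanning through a series of selected masses lets every ion be measured in turn.
for target in (39, 65, 91, 134):
    print(f"selected m/z {target}: passed {quadrupole_pass(ions_in_region, target)}")
```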
GC/MS analysis
The analyst then attempts to identify a substance, interpreting the generated spectrum by comparing the relative concentrations among the atomic masses. Two kinds of analysis are possible at this point: comparative and original. Comparative analysis essentially compares the spectra associated with various compounds from a spectrum library to see if their characteristics are present within the sample. This is best performed by computer because there are a myriad of visual distortions that can take place due to variations in scale. Also, because the conditions used to generate the library spectra will probably not produce the same visual presentation, an analyst might be trying to compare apples to oranges. Computers tend to do a better job in this regard because they can simultaneously correlate more data (such as the retention times identified by GC) and thereby relate the data more accurately.
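As a minimal sketch of how a computer might compare raw spectral data against a library entry (the spectra below are invented), a cosine similarity over aligned m/z channels ignores overall scale and so avoids the distortions that complicate visual comparison.

```python
import math

# Hypothetical spectra: m/z -> intensity (any scale; the similarity measure is scale-invariant).
sample_spectrum  = {39: 20, 65: 35, 91: 100, 134: 28}
library_spectrum = {39: 18, 65: 40, 91: 100, 134: 30}

def cosine_similarity(a, b):
    """Normalized dot product of two spectra over the union of their m/z channels."""
    channels = set(a) | set(b)
    dot = sum(a.get(mz, 0) * b.get(mz, 0) for mz in channels)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

print(f"match score: {cosine_similarity(sample_spectrum, library_spectrum):.3f}")  # near 1.0 = strong match
```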
The other analysis measures the peaks in relation to one another, the tallest receiving a value of 100% and the others receiving proportionate values—all values above 3% must be accounted for. The parent peak indicates the total mass of the unknown compound and reflects the highest mass on the spectrum. Based on this value, the analyst must attempt to fit the masses of all the other fragments into that value. Beyond that, a molecular structure and bonding must be defined, consistent with a substance with the characteristics recorded by GC/MS. This method too is best performed by computer, both because of the possibility of human error and because the computer can exhaust all mathematical possibilities in making the analytical determinations. A computer can effectively and almost instantly perform both processes with the lowest rate of error and the maximum confirmation.
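A minimal sketch of that normalization step, with invented peak intensities: scale the spectrum so the tallest peak reads 100%, discard peaks at or below the 3% threshold, and treat the highest remaining mass as the parent peak.

```python
# Hypothetical raw peak intensities keyed by m/z.
raw_peaks = {39: 410, 51: 22, 65: 710, 77: 30, 91: 2050, 134: 560, 135: 40}

base_intensity = max(raw_peaks.values())

# Normalize so the tallest (base) peak is 100%, then keep only peaks above the 3% threshold.
normalized = {mz: 100 * i / base_intensity for mz, i in raw_peaks.items()}
significant = {mz: pct for mz, pct in normalized.items() if pct > 3}

parent_mz = max(significant)  # highest mass remaining, taken here as the parent peak
print(f"parent peak at m/z {parent_mz}")
for mz, pct in sorted(significant.items()):
    print(f"m/z {mz}: {pct:.1f}%")
```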
A "full spectrum" analysis considers all the "peaks" within a spectrum. Another method, however, is selective ion monitoring (SIM), which looks only at a few characteristic peaks associated with a candidate substance. A lab does this on the assumption that a given set of ions is characteristic of a certain substance. SIM is fast and efficient, which makes the analysis less expensive for the lab, but the underlying assumption is a scientific hypothesis in itself.
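A short sketch of the difference, again with invented values: SIM simply never examines anything outside the handful of m/z channels assumed to characterize the candidate substance.

```python
# Full-scan data (hypothetical): m/z -> relative intensity.
full_scan = {39: 20, 51: 4, 65: 35, 77: 6, 91: 100, 105: 8, 134: 28}

# SIM: monitor only the ions assumed to be characteristic of the candidate substance.
characteristic_ions = [91, 134]  # this choice is itself the hypothesis
sim_reading = {mz: full_scan.get(mz, 0) for mz in characteristic_ions}

print(sim_reading)  # every other peak in the spectrum goes unexamined
```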
GC/MS attacks and counterattacks
Equipment/operator problems
The injection port septum utilized in GC/MS systems is multi-use, but its longevity ranges from 100 to 200 uses depending on the type of material and the temperatures to which it is subjected. Given this range in lifespan, an opponent may challenge the test results on the basis that a worn injection port septum was used. This is especially true if there are no records on the use of the injection port, or if the septum was used beyond the low end of its life expectancy. Analysis can be especially compromised if some of the sample leaks out, potentially distorting quantitation. This can be cured by systematically changing septa before the low end of their life expectancy. In addition, an automatic injection mechanism eliminates this problem by eliminating the short-lived parts, and it can also play a role in monitoring the integrity of the system, guarding against leaks. In fact, leaks in general can be problematic. Incorrectly sealed column tubing can leak to the point of causing a bad separation that can skew any values obtained. To prevent this problem from occurring in the first place, a lab can make it its practice to seal parts like the column tubing with a bonding material such as polyimide. Bad separation can often be identified by overlapping peaks on the chromatogram. If someone attacking the result of GC/MS can point to such a chromatogram, the reliability of that evidence is suspect. The person in a position to introduce such evidence should be unwilling to do so when there is such a chromatogram.
If heat is incorrectly applied, especially due to bad instrumentation, it can adversely affect the reading as well. If thermostat sensors in the heating unit become desensitized, the system can be systematically overheated to the point of causing structural changes or decomposition in the sample. It is also important to ensure that no part of the GC/MS assembly is contaminated by a sample fragment left behind by a prior test. This can happen when a stream is big enough to be accommodated by the GC system, but not by the MS system. This possibility could be detected by running a "blank" test, but huge laboratory backlogs make this problematic, given the length of time that GC/MS takes. Therefore, another way to address this problem would be to ensure that no more than a specified amount is ever placed into the machine. Of course, procedures do not negate human error unless they are followed. Although more expensive, an automated sampling/injection system may eliminate most forms of human error related to intake. Because automatic injection units can typically be pre-loaded with a string of samples to analyze, this ultimately may lead to increased efficiency. Another, perhaps less costly option for labs that have several GC/MS apparatuses is, as standard procedure, to run confirmatory tests on positive samples on a different machine.
Basic attacks on equipment procedure and maintenance can also be made. Attention should be paid to cleaning procedures and whether they are in accordance with the manufacturer-specified cleaning procedure, in terms of where to clean, frequency, and methodology. Similarly, one attacking GC/MS should follow up on regular maintenance issues, such as parts replacements, regular service, and self-diagnostic techniques.
Interpretation
The problem of interpretation has been steadily shifting over the years. As previously described, there is room for human error, especially in terms of scale, computation, and identification. In a laboratory environment where there is constant pressure to do things faster and more efficiently, an analyst who is not using a computer to make analytical computations is likely to be cutting corners to compensate for the inherent slowness of manual analysis. It is especially important to ensure that selective ion monitoring is not being passed off by the opposing attorney as full-scan GC/MS. Alternatively, when a lab determines a result to be positive using comparisons to stock libraries, the comparison can reflect differing conditions and scales that make a visual comparison rather difficult if not deceptive. Computers provide a better basis for comparison, because the computer compares the raw data rather than a visual representation of the data. If an analyst desires to examine the comparisons in detail, the comparisons would be presented on the correct scale due to adjustments the computer can make. However, the libraries on computers are still generated under different conditions, only some of which may be compensated for. The differing conditions, however, may not make comparisons impossible: even in different environments the proportions of chemical composition and the atomic weights of constituent atoms would be represented accurately. Such would be the case if both the library-generating test and the laboratory test were correctly administered. Despite the many variables, GC/MS should always be able to correctly assess quantity and identity (via atomic mass) in a way that correlates objectively to reality.
Interpretation in the courtroom
Provided the lab is relying on state-of-the-art computers, one must ensure they are not so state-of-the-art that they fall outside the requirements of the relevant scientific evidence standard, whether it be the Frye standard or the Daubert standard. Otherwise a proponent of such evidence may end up having to empirically validate a computer program or an aspect of such a program. Additionally, since many computations are made algorithmically rather than by visual comparison and extrapolation, one might miss ordinary indicators of a potential problem. One such example would be a chromatogram, which is helpful in showing overlapping peaks. If an analyst foregoes some of the graphic representations, it is imperative that the program perform the same function. However, it is certainly to the protection of the analyst to produce as much documentation as possible and give it a once-over just to ensure there are no obvious problems.
Next generation GC/MS
GC/MS has long been a laboratory technique. Delicate, sensitive, scientific hardware lends itself to the laboratory, especially when it takes up the better part of a room. However, new innovations in GC/MS technology are bringing the technology into the field. The technology is getting smaller—even portable. And the processes are being done faster—sometimes fifty times faster, sometimes hundreds of times faster.
Technological improvements
Speed: high speed GC / time-of-flight MS
A set of new technologies, although combined in different ways, is generically referred to as "high speed" GC/MS, or more accurately as high speed GC/time-of-flight MS. "High speed" is probably an apt descriptor because new products can complete analysis very quickly. For example, in 1996 the top-of-the-line high-speed GC/MS units completed analysis of fire accelerants in less than 90 seconds, whereas first-generation GC/MS would have required at least 16 minutes. Two primary improvements in terms of speed were made to the GC unit; one is that the column heater now envelops the column, so instead of heating at a rate of 30°C per minute, it can heat at 100°C per second.
The accompanying mass spectrometry unit is in a new class of units designated "time of flight" (TOFMS). TOFMS makes its measurements by sending fragments down a low-pressure tube called a "drift tube," where they hit the detector in order of their mass, and a measurement is made of the flight time of each. The old class of units could only produce a few spectra per second at their top speed, and quite typically far fewer; TOFMS can now produce 500 mass spectra per second. It also has a virtually unlimited mass range, whereas standard MS has more limitations. These rates allow for accurate tracking of the considerably narrower peaks generated by high speed gas chromatography (HSGC).
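A hedged sketch of the time-of-flight principle: an ion accelerated through a potential V acquires kinetic energy zeV, so its flight time down a drift tube of length L grows with the square root of its mass-to-charge ratio. The tube length and voltage below are illustrative assumptions, not the specifications of any actual instrument.

```python
import math

E_CHARGE = 1.602e-19   # elementary charge, in coulombs
AMU_KG   = 1.661e-27   # one atomic mass unit, in kilograms

DRIFT_TUBE_M = 1.0     # illustrative drift tube length, meters
ACCEL_VOLTS  = 2000.0  # illustrative accelerating potential, volts

def flight_time_us(mass_amu, charge=1):
    """Flight time in microseconds: t = L * sqrt(m / (2 * z * e * V))."""
    m = mass_amu * AMU_KG
    t = DRIFT_TUBE_M * math.sqrt(m / (2 * charge * E_CHARGE * ACCEL_VOLTS))
    return t * 1e6

# Lighter fragments arrive at the detector first, heavier fragments later.
for mass in (39, 91, 134):
    print(f"m/z {mass}: about {flight_time_us(mass):.2f} microseconds")
```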
New software allows for "spectral deconvolution," which addresses the problem of overlapping chromatogram peaks where the overlapping components have different fragmentation patterns. However, when it comes to very complex mixtures, some chromatographic peaks may overlap to a degree that deconvolution cannot rectify, even with the highest available rate of data acquisition. Fortunately, this is limited to extremely complex mixtures (15+ primary compounds). Most useful is the capability to automatically define peaks—and in some cases not just relatively, but quantitatively.
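A minimal sketch of the idea behind deconvolution, under the simplifying assumption that the combined spectrum of two co-eluting components is a weighted sum of their (hypothetical) reference spectra; a least-squares fit then recovers each component's contribution.

```python
import numpy as np

# Hypothetical reference spectra for two co-eluting components over m/z channels 39, 65, 91, 105, 134.
component_a = np.array([20.0, 35.0, 100.0, 0.0, 28.0])
component_b = np.array([5.0, 0.0, 10.0, 100.0, 60.0])

# Observed (mixed) spectrum: constructed here as 0.7*A + 0.3*B plus a little noise.
observed = 0.7 * component_a + 0.3 * component_b + np.array([0.5, -0.3, 1.0, -0.8, 0.2])

# Solve for the contribution of each component in a least-squares sense.
design = np.column_stack([component_a, component_b])
weights, *_ = np.linalg.lstsq(design, observed, rcond=None)
print(f"estimated contributions: A = {weights[0]:.2f}, B = {weights[1]:.2f}")
```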
Accuracy: MS/MS and MSn
Another technology that has come to improve accuracy is MS/MS. The original “tandem MS,” from which this technology derived, involved two MS units and was dubbed MS/MS. Interestingly, MS/MS has separation capabilities on its own, but we will consider it in conjunction with GC because GC/MS/MS lends itself to similar applications as GC/MS. MS/MS can work either physically in tandem (with two physical units), or using an ion trap (using only one physical unit). The ion trap MS unit is generally the configuration of choice because it is less expensive and more efficient. MS/MS is a relatively new player in the world of forensics and scientific evidence, although the technology has existed since the 1970s. Only recently, however, has MS/MS become less prohibitively arcane and expensive.
GC/MS/MS functions just like GC/MS but with an additional MS step. Some particles will be analyzed with greater specificity in that final step. In the case of an actual two MS tandem, a vacuum pump would be required to transport particles from the first MS and place them into the second. In the case of an ion trap, the ions to be further tested collect in the trap until they are reanalyzed by the MS process.
The "second round" of MS involves isolating an ion of interest and routing it to a collision cell. Within the collision cell, the ion experiences collisions with argon or another inert gas. These "parent" ions are put through the mass analyzer, and the resulting fragmentation produces "daughter" ions. This brings an ion of interest into the foreground even in an otherwise convoluted and noisy picture, clarifying the relationship of small ions to larger ions in the spectrum.
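A minimal conceptual sketch of the two-stage selection, with invented m/z values and an invented fragmentation table: isolate a parent ion of interest from the first-stage results, then record the daughter-ion spectrum its fragmentation produces.

```python
# Hypothetical ions detected in the first MS stage, by m/z.
first_stage_ions = [91, 105, 134, 199, 304]

# Invented fragmentation table: parent m/z -> daughter-ion spectrum (m/z -> relative intensity).
FRAGMENTATION = {
    304: {182: 100, 105: 45, 82: 20},
    199: {171: 100, 93: 30},
}

def ms_ms(parent_of_interest, isolation_window=0.5):
    """Isolate the chosen parent ion, 'collide' it, and return its daughter-ion spectrum."""
    isolated = [mz for mz in first_stage_ions
                if abs(mz - parent_of_interest) <= isolation_window]
    if not isolated:
        return {}
    return FRAGMENTATION.get(isolated[0], {})

print(ms_ms(304))  # the daughter ions bring this parent into the foreground
```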
GC/MS applications
GC/MS is a technique the very process of which is technologically and conceptually impressive. After all, it rips substances apart into molecules, shatters those molecules into fragments, measures their speed once energized, and even identifies what the substance is. But more impressive in a forensic sense is a highly portable device that analyzes the particles from a human body in order to link that person to a crime. Also impressive is the ability to take something destroyed beyond all recognition, even beyond the comprehension of GC/MS, and apply a "super" GC/MS that purports to show arson or drug ingestion. This section discusses such GC/MS applications in three vignettes.
GC/MS/MS: reconstructing molecular fingerprints
The analysis of fire debris using GC/MS is "well established," and there is even an established American Society for Testing and Materials (ASTM) standard for fire debris analysis. Analysis of fire debris is constantly complicated by pyrolysis products, the result of combustion-produced decomposition and transformation. The pyrolysis products often constitute a dominant portion of a sample. They can complicate identification of an accelerant because the residue of gasoline is very slight compared to the proportion of pyrolysis products.
Figure 1 illustrates this problem, comparing an analysis of a gasoline sample with carpet burnt with gasoline. Based on Figure 1, an analyst could form "no positive conclusion that gasoline was used as an accelerant." However, based on Figure 2, an analyst would be able to testify that the correlation between the gasoline and the carpet sample was "excellent."
Figure 2 was not, however, the inevitable outcome of GC/MS/MS analysis. The analyst, based on prior information or possibilities illuminated by a GC/MS spectrograph, programs specific ions to be further analyzed in the second MS process. In Figure 2, the analyst chose key compounds characteristic of aromatics. This made possible a direct correlation between these standard compounds and the burnt material.
EGIS: science at the airport
In a post-September 11th development, explosives-detection systems will soon be part of all US airports. The post-9/11 Transportation Security Administration is installing hardware to screen 100% of passenger baggage at all of the US's 429 airports by December 31, 2002. This will involve the deployment of 4,700-6,000 trace detection systems nation-wide, and will require $427 million in additional spending. While full-blown explosive detection systems cost roughly $1,000,000, explosives trace detection systems cost roughly $50,000. These bargain-priced systems run on a host of technologies, many of them based on GC/MS. There are only three manufacturers certified by the FAA to provide these systems, one of which is Thermo Detection (formerly Thermedics), which produces the EGIS, a GC/MS-based line of explosives detectors.
EGIS-type units can use an 8-pound portable vacuum system that sucks in air around the luggage to be analyzed. The vacuum functions as the sampler of the GC/MS system. Alternatively, a cloth can be used to wipe surfaces, and the cloth can be inserted directly into an injection port in the GC/MS system. The sample is analyzed with the push of a button. If the unit detects an explosive, it will activate an "audible, LCD, or LED" alarm.
Sentor: Identifying particles of guilt
Consider the following scenario:
On the morning of March 29, 1993, a United States Navy surveillance aircraft, a P3 Orion, was on a routine drug-interdiction mission in international waters off the coast of the Dominican Republic. Spotting a low-profile vessel in the waters below, the aviators identified the boat to be similar to the type of vessel commonly used in narcotics smuggling. After the Orion made several passes and witnessed the ship’s crew members tossing bales overboard into the ocean, small arms tracer rounds came streaming towards the plane. Soon thereafter, Coast Guard officials aboard a Navy frigate, U.S.S. TAYLOR, intercepted and boarded the low-profile vessel. After an exhaustive search, the American officials were unable to locate any contraband on the boat or on the defendants.
Law enforcement officials, attempting to connect the bales of cocaine with the defendants, produced the Sentor, a state-of-the-art electronic device able to detect the faintest molecular traces of cocaine. The officers, using what looked like a large hand-held flashlight, approached the defendants and pointed the device towards their bodies. The machine began to vacuum in a large volume of air around the defendants' bodies. The officers then took samples of the air on board the boat. Within thirty seconds, the drug-interdiction officials were able to detect trace amounts of cocaine on both the defendants and the boat. Based upon this and other evidence, the defendants were arrested and later convicted of smuggling narcotics.
Real scenarios like the one presented above are becoming increasingly common, as the drug-sniffing, GC/MS-powered Sentor is currently in use by drug enforcement agents, the Coast Guard, and the Border Patrol. The scenario suggests the benefits of such a system, but it also elicits worries. This particular scenario inspires the audience to want the test to confirm what we already "knew": that the boat crew were ejecting bricks of cocaine into the ocean. We already knew they were guilty. In one scene we go from the Coast Guard officials taking air samples from the boat and from the bodies of each individual to the happy ending. In a time frame fast even for Hollywood, less than a minute later, the aviators' visual observations were "proven" through the magic of GC/MS and the machine called Sentor.
The Sentor is actually a derivative of the EGIS machine described in the previous section, but instead of being optimized for explosives detection, it is optimized to detect drugs. The following technical description applies also to the EGIS hardware.
Consistent with the GC/MS enhancements described earlier (in section 3.1.1), the unit employs high speed GC accompanied by time-of-flight mass spectrometry. An interesting enhancement is the portable vacuum air auto-sampler, which was described above as resembling a large, handheld flashlight. Unlike traditional GC/MS, where samples are injected with a syringe into the mobile phase, the auto-sampler fits directly into an auto-intake/injection system.
The operation of the machine is simple. The auto-sampler is inserted into a machine resembling a refrigerator, the GC/MS process begins, and finally the machine displays the quantity of any drugs detected. Unlike the laboratory environment, there is no room to fine-tune. But there is also little possibility of operator error. This principle of operation, in the words of manufacturer Thermedics Detection, Inc., can be described as "simple one-person, one-button operation." After all, this application was not designed for scientists.
The foregoing description addresses Sentor in the criminal context, largely because of the particular agencies involved with the technology at present. However, Pinkerton Security and Investigation Services has entered a marketing agreement to administer the test in the workplace. It would be run by "specially trained Pinkerton guards" who would analyze everything and everyone in a 15,000 square foot business for approximately $5,000. In this civil context, Sentor-type technology raises some perplexing issues.
Reflection: Creepy little particles of guilt
The Supreme Court determined in United States v. Place that sniffs by trained drug dogs did not constitute a "search" within the meaning of the Fourth Amendment. The high-tech sniffing that goes on with GC/MS technology is considerably more powerful than the sniffing ability of a dog. But in a certain sense, GC/MS is doing the same thing—"inhaling" certain particles and "alerting" when they are the bad ones. The worry, however, is that, given the fact that suspicious particles are in the air and can be inadvertently picked up from other people and from objects like money, this "dog" could "alert" on anyone. This could give police—should they begin its widespread use—the power to conduct a more invasive search and just use the Sentor as a pretext. Perhaps, however, the Sentor may be more like thermal infrared imaging, which the Supreme Court held to be an unconstitutional search in Kyllo v. United States.
Another element for this discussion is Federal Rule of Evidence 403. Because of the ability to tie people to certain stashes of drugs based on their chemical makeup, there doubtless would be some instances where parties would try to introduce Sentor-type results as evidence in themselves. This certainly was the case for the drug traffickers off the coast of the Dominican Republic. The fact that this evidence "magically" discerns the presence of illicit substances might render it substantially more prejudicial than probative. Because the GC/MS configuration in the Sentor-type situation is essentially a "press here, dummy" configuration, one may be left with the impression that because it so simply indicates the presence of some drug residue, a person who triggers its alarm must be "simply guilty." The allure of the magic box, in conjunction with the misconception that people with drugs on them (however slight) are drug users, may cause an average juror to become prejudiced. While we know that jurors express a healthy skepticism toward scientific evidence and expert testimony, this technology does not necessarily have all the off-putting qualities of scientific machinery. For one, the Sentor and EGIS are very simple when it comes to their interfaces. However, this prejudice may be cured by effective lawyering on the other side, not in terms of the technicalities of GC/MS, but rather in showing how common contamination is.
Questions of first impression about GC/MS/MS were raised in United States v. Campbell. In this case, Private Campbell's urine tested positive for LSD only under GC/MS/MS at Northwest Toxicology Laboratories, but not under conventional tests at Fort Meade, which included radioimmunoassay analysis (RIA). The RIA test indicated some LSD in the urine, but the result did not rise to the 200-picogram standard established by the military. Only with the GC/MS/MS test did the military discover 307 picograms of LSD, enough to convict Campbell for using LSD. The defense successfully maintained that GC/MS/MS is not reliable under MRE 702, primarily arguing that it did not meet the Daubert test. The U.S. Court of Appeals for the Armed Forces (CAAF) reversed the intermediate court, maintaining that the trial court had correctly found that the test did not meet the Daubert standard for reliability:
“Of critical importance is that the Government did not prove the levels or frequency of error, which would indicate: (1) that the particular GC/MS/MS test reliably detected the presence of LSD metabolites in urine; (2) that GC/MS/MS reliably quantified the concentration of those metabolites; and (3) that the DoD cutoff level of 200 pg/ml was greater than the margin of error and sufficiently high to reasonably exclude the possibility of a false positive and establish the wrongfulness of any use. In particular, the Government introduced no evidence to show that it had taken into account what is necessary to eliminate the reasonable possibility of unknowing ingestion or a false positive.”
Falling far short of an in-depth treatment of Campbell, this introduction serves to highlight the fact that GC/MS/MS is not GC/MS, even though GC/MS/MS relies on GC/MS. While GC/MS is nearly universally accepted as the "gold standard" in a broad array of applications, one ought not assume that this reverence extends much further.
The code section under which Private Campbell was prosecuted required "knowing use," and this knowing use could theoretically be established upon a certain level of narcotic in the bloodstream. One issue that surfaces in a number of novel GC/MS applications is the meaning of standards. It is troubling that, upon failing to achieve the threshold, one might turn to novel technology which can prove more of a substance is there. It is not automatically clear whether the first test understated the presence of a chemical, or the later test overstated its presence. But it does make sense that, as in Campbell, there should be a burden to establish that a test proving the presence of a substance beyond what established tests can show is reliable in its quantification in its own right. Otherwise the application of standards may not be a function of objective testing, but instead a function of "test shopping."
Sentor, which relies on a technology already proven to be the "gold standard" for drug testing, may be open to a similar attack. Even though the High Speed Gas Chromatography (HSGC)/Time-of-Flight Mass Spectrometry (TOFMS) upon which Sentor relies is essentially a "souped up" version of first-generation GC/MS, one may question whether it is still essentially GC/MS, based on its hardware. A court may require a separate foundation apart from GC/MS. Adding hardware to increase precision, such as an additional MS unit, as above, seems to make GC/MS/MS different enough from GC/MS to warrant independent establishment of its validity. In the case of replacing slower hardware with faster hardware (as with HSGC/TOFMS), does that also require a separate showing of reliability from other GC/MS technologies?
The proponent of such evidence may argue that it does not warrant a separate showing because there are many variants among hardware configurations in GC/MS setups that have various characteristics—such as faster or slower speed—and such configurations have always been treated interchangeably in case treatments of GC/MS. Because the procedure is based upon the same essential technology, it could be argued, it should not therefore be treated differently than GC/MS has been. However, perhaps more powerfully, one could argue that Sentor is fundamentally different: HSGC/TOFMS is lower resolution, and while it is widely suggested that it makes up for that fact by performing 500 analyses per second, it remains important that an equivalency be evidenced.
Of course, would Sentor readings really be used as evidence? At first it might not seem so. It might merely be a screening test that gives rise to further confirmatory tests. But think about the scenario detailed above, where the Coast Guard employs Sentor to link the accused to bricks of cocaine in the ocean. The fact that the Sentor reported the presence of cocaine, with a chemical mixture identical to that of the cocaine in the bricks, is the most incriminating fact in the scenario as described. Certainly, the analysis of the cocaine in the bricks may be performed later in the laboratory to verify the results of the portable test. However, it is unlikely that the Coast Guard preserved an additional air sample. But this perhaps highlights another point: even when the identification itself is not very suspect, the source of the sample might be.
The Sentor (and the EGIS for that matter) sample imprecisely, especially when they are analyzing air. To suggest that there is much precision to vacuuming particles from around one's person is quite suspect. Air is fluid and does not cling to the body like leaves hang on a tree. Contamination with an illicit substance may come from the environment. The particles that cling to someone may be there because they have come into contact with someone else. Individuals are constantly involuntary and inadvertent bearers of other people's particles. Along that line of thought, one ought to consider that, in order to activate a Sentor, one need only come in contact with an individual who had handled such drugs recently. Such an individual might be a police officer, a drug dealer, or a pharmacist. Certainly, only a minuscule amount of particles would transfer through casual contact. However, due to Sentor's use of GC/MS, it is capable of discovering traces of narcotics like cocaine amounting to less than one billionth (0.000000001) of a gram. Moreover, most paper currency is contaminated with cocaine residue. Anyone who constantly handles money could theoretically set off a Sentor.
The concern for incidental contamination is not so far-fetched. It was a concern addressed by the Campbell court. Incidental contamination also poses an increasingly monumental danger when otherwise insignificant contaminants become magnified along with the rest of the sample: what was once incidental looms quite large. It is essential to bring this problem into focus before the courts by suggesting, where appropriate, that mountains are being made of molehills. Even with proportional magnification, there is the danger that the sample itself is rather small and ceases to be representative of the whole. Imagine a sample from a haystack with a needle in it, consisting of 3 pieces of hay and 1 needle. Bear in mind that unlike a blood or urine sample, some of these more novel sampling techniques will feature clustering of a residue. A gust of wind might "link" someone to a massive drug operation because a large fragment stuck to his or her lapel.
Standards can solve some of this problem. A standards-setting organization can determine at what level a reading can be truly probative. How this would be determined is problematic. But assuming it can be done, perhaps one can come up with a standard that excludes the amount of particles that might be merely blown or inadvertently rubbed onto a body. But can the same be said for GC/MS/MS? GC/MS/MS can really look behind a situation that is dominantly "debris" and see the gasoline that was used to burn the house down. But of what probative value is this, when perhaps GC/MS/MS detects not the accelerant in the fire but instead BBQ fluid that had spilled inadvertently into the carpet months ago on the Fourth of July? At some point, detecting too little may be too much for the courts, because alternative explanations can effectively rebut minute samples.
See also: Mass spectrometry
Mass spectrometry is a technique for separating ions by their mass-to-charge (m/z) ratios.
Mass spectrometry can be divided into two broad applications: identification of compounds by their mass, or determination of the isotopic composition of one or more elements in a compound.
The most common form of mass spectrometry is gas chromatography-mass spectrometry (GC/MS). In this technique a gas chromatograph is used to separate compounds. The stream is fed into the ion source, a metallic filament to which voltage is applied. This filament discharges electrons which ionize the compounds. The ions can then fragment further, yielding predictable patterns. The stream then passes into the detector.
There is more than one type of capillary column used in a GC/MS system. First there is the Wall Coated Open Tubular (WCOT) column. This design is the most advanced, offering an extremely large number of theoretical plates*. It coats only the interior of the tubing, leaving an open center, and it allows for the best separation. Another is the Porous Layer Open Tubular (PLOT) column, whose wall carries a porous solid layer; at its development this allowed for an increase in theoretical plates, though not to the same degree. The third is the Support Coated Open Tubular (SCOT) column, whose wall carries a thin layer of support material bearing the stationary phase; this is a fairly uncommon way of configuring a GC/MS now that WCOT exists.
Theoretical plates: at its development, a separatory column was used to separate each component. To do this, the sample moved between literal plates on which each component could separate, at least to some degree. With a WCOT column there are no physical plates; instead, the plate height is determined by dividing the length of the capillary tube by the number of theoretical plates. The number of theoretical plates is defined as N = 16(tRi/Wi)^2, where tRi is the retention time of component i and Wi is the width of its peak at the base.
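Written out as equations (standard chromatography definitions, supplied here rather than drawn from the source), the plate count and the height equivalent to a theoretical plate are:

```latex
% N: number of theoretical plates; t_{R,i}: retention time of component i;
% W_i: width of component i's peak at the base; H: plate height; L: column length.
N = 16\left(\frac{t_{R,i}}{W_i}\right)^{2},
\qquad
H = \frac{L}{N}
```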
