
Designation: D 6792 – 07 An American National Standard

Standard Practice for Quality System in Petroleum Products and Lubricants Testing Laboratories1

This standard is issued under the fixed designation D 6792; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (e) indicates an editorial change since the last revision or reapproval.

1. Scope*

1.1 This practice covers the establishment and maintenance of the essentials of a quality system in laboratories engaged in the analysis of petroleum products and lubricants. It is designed to be used in conjunction with Practice D 6299.

NOTE 1—This practice is based on the quality management concepts and principles advocated in ANSI/ISO/ASQ Q9000 standards, ISO/IEC 17025, ASQ Manual,2 and ASTM standards such as D 3244, D 4182, D 4621, D 6299, D 6300, E 29, E 177, E 456, E 548, E 882, E 994, E 1301, E 1323, STP 15D,3 and STP 1209.4

1.2 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory requirements prior to use.

2. Referenced Documents

2.1 ASTM Standards: 5

D 3244 Practice for Utilization of Test Data to Determine Conformance with Specifications

D 4182 Practice for Evaluation of Laboratories Using ASTM Procedures in the Sampling and Analysis of Coal and Coke

D 4621 Guide for Quality Management in an Organization That Samples or Tests Coal and Coke

D 6299 Practice for Applying Statistical Quality Assurance Techniques to Evaluate Analytical Measurement System Performance

D 6300 Practice for Determination of Precision and Bias Data for Use in Test Methods for Petroleum Products and Lubricants

D 6617 Practice for Laboratory Bias Detection Using Single Test Result from Standard Material

E 29 Practice for Using Significant Digits in Test Data to Determine Conformance with Specifications

E 177 Practice for Use of the Terms Precision and Bias in ASTM Test Methods

E 456 Terminology Relating to Quality and Statistics

E 548 Guide for General Criteria Used for Evaluating Laboratory Competence6

E 882 Guide for Accountability and Quality Control in the Chemical Analysis Laboratory

E 994 Guide for Calibration and Testing Laboratory Accreditation Systems General Requirements for Operation and Recognition6

E 1301 Guide for Proficiency Testing by Interlaboratory Comparisons

E 1323 Guide for Evaluating Laboratory Measurement Practices and the Statistical Analysis of the Resulting Data

2.2 ISO Standards:7

ISO Guide 30 Terms and Definitions Used in Connection with Reference Materials

ISO/IEC 17025 General Requirements for the Competence of Testing and Calibration Laboratories

ISO 4259 Petroleum Products—Determination and Application of Precision Data in Relation to Methods of Test

ANSI/ISO/ASQ Q9000 Quality Management System Standards

3. Terminology

3.1 Definitions:

3.1.1 accepted reference value, ARV, n—a value that serves as an agreed upon reference for comparison, and which is derived as: (1) a theoretical or established value, based on scientific principles, (2) an assigned value, based on experimental work of some national or international organization such as the U.S. National Institute of Standards and Technology (NIST), or (3) a consensus value, based on collaborative experimental work under the auspices of a scientific or engineering group. E 456

1 This practice is under the jurisdiction of ASTM Committee D02 on Petroleum Products and Lubricants and is the direct responsibility of Subcommittee D02.94 on Coordinating Subcommittee on Quality Assurance and Statistics.

Current edition approved July 1, 2007. Published August 2007. Originally approved in 2002. Last previous edition approved in 2006 as D 6792–06.

2 “Quality Assurance for The Chemical and Process Industries: A Manual of Good Practices,” 1987, available from American Society for Quality (ASQ), 600 N. Plankinton Ave., Milwaukee, WI 53203. www.asq.org.

3 ASTM STP 15D, ASTM Manual on Presentation of Data and Control Chart Analysis, ASTM International, W. Conshohocken, PA.

4 ASTM STP 1209, ASTM Manual on Total Quality Management, ASTM International, W. Conshohocken, PA.

5 For referenced ASTM standards, visit the ASTM website, www.astm.org, or contact ASTM Customer Service at service@astm.org. For Annual Book of ASTM Standards volume information, refer to the standard’s Document Summary page on the ASTM website.

6 Withdrawn.

7 Available from American National Standards Institute (ANSI), 25 W. 43rd St., 4th Floor, New York, NY 10036, http://www.ansi.org.


*A Summary of Changes section appears at the end of this standard.

Copyright © ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States.



3.1.2 accuracy, n—the closeness of agreement between a test result and an accepted reference value. E 456

3.1.3 audit, n—a systematic examination of a laboratory's quality system procedure and related activities by an internal or external team to determine whether these procedures or activities are implemented according to the documented system.

3.1.4 bias, n—the difference between the population mean of the test results and an accepted reference value. E 456

3.1.5 calibration standard, n—a material with a certified value for a relevant property, issued by or traceable to a national organization such as NIST, and whose properties are known with sufficient accuracy to permit its use to evaluate the same property of another sample.

3.1.6 certified reference material, CRM, n—a reference material one or more of whose property values are certified by a technically valid procedure, accompanied by a traceable certificate or other documentation which is issued by a certifying body. ISO Guide 30

3.1.7 measurand, n—the measurable quantity subject to measurement.

3.1.8 outlier, n—a result far enough in magnitude from other results so as to be considered not a part of the set. D 6300

3.1.9 precision, n—the closeness of agreement between test results obtained under prescribed conditions. E 456

3.1.10 proficiency testing, n—determination of a laboratory's testing capability by evaluating its test results in interlaboratory exchange testing or crosscheck programs.

3.1.10.1 Discussion—One example is the ASTM D02 committee's proficiency testing programs in a wide variety of petroleum products and lubricants, many of which may involve more than a hundred laboratories.

3.1.11 quality assurance (QA), n—a system of activities, the purpose of which is to provide to the producer and user of a product, measurement, or service the assurance that it meets the defined standards of quality with a stated level of confidence.

3.1.11.1 Discussion—Quality assurance includes quality planning and quality control.

3.1.12 quality control (QC), n—a planned system of activities whose purpose is to provide a level of quality that meets the needs of users; also the uses of such a system.

3.1.13 quality control sample (QC sample), n—for use in quality assurance program to determine and monitor the precision and stability of a measurement system; a stable and homogenous material having physical or chemical properties, or both, similar to those of typical samples tested by the analytical measurement system. The material is properly stored to ensure sample integrity, and is available in sufficient quantity for repeated long-term testing. D 6299

3.1.14 reference material (RM), n—a material with accepted reference value(s), accompanied by an uncertainty at a stated level of confidence for desired properties, which may be used for calibration or quality control purposes in the laboratory.

3.1.14.1 Discussion—Sometimes these may be prepared “in-house” provided the reference values are established using accepted standard procedures.

3.1.15 repeatability, n—the quantitative expression of the random error associated with a single operator in a given laboratory obtaining repetitive results with the same apparatus under constant operating conditions on identical test material. It is defined as the difference between two such results at the 95 % confidence level. D 6300

3.1.16 reproducibility, n—a quantitative expression of the random error associated with different operators using different apparatus, and so forth, each obtaining a single result on an identical test sample when applying the same method. It is then defined as the 95 % confidence limit for the difference between two such single and independent results. D 6300

3.1.17 site precision (R'), n—the value below which the absolute difference between two individual test results obtained under site precision conditions may be expected to occur with a probability of approximately 0.95 (95 %). It is defined as 2.77 times the standard deviation of results obtained under site precision conditions. D 6299

3.1.18 site precision conditions, n—conditions under which test results are obtained by one or more operators in a single site location practicing the same test method on a single measurement system using test specimens taken at random from the same sample of material over an extended period of time spanning at least a 15 day interval. D 6299
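For illustration only (not part of the definitions): the 2.77 factor in 3.1.17 can be applied directly to QC results collected under site precision conditions. The following minimal Python sketch uses made-up values and illustrative variable names.

```python
import statistics

# Hypothetical QC results gathered under site precision conditions
# (same method, same site, spanning at least 15 days); values are illustrative.
qc_results = [0.502, 0.498, 0.505, 0.497, 0.503, 0.499, 0.501, 0.504]

site_sd = statistics.stdev(qc_results)  # standard deviation under site precision conditions
site_precision = 2.77 * site_sd         # R' per 3.1.17

print(f"site precision standard deviation = {site_sd:.4f}")
print(f"site precision R' = {site_precision:.4f}")
```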

3.1.19 traceability, n—property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of comparisons all having stated uncertainties.

3.2 Definitions of Terms Specific to This Standard:

3.2.1 precision ratio (PR), n—an estimate of the relative magnitude of repeatability and reproducibility. The PR for a given standard test method can provide information on the relative significance between variation caused by different operators and laboratories compared to a single operator in a laboratory performing the standard test method.

3.2.2 test performance index (TPI), n—an approximate measure of a laboratory’s testing capability, defined as the ratio of test method reproducibility to site precision.

3.3 Acronyms:

3.3.1 NIST—National Institute of Standards and Technology (formerly called National Bureau of Standards), Gaithersburg, MD.

4. Significance and Use

4.1 A petroleum products and lubricants testing laboratory plays a crucial role in product quality management and customer satisfaction. It is essential for a laboratory to provide quality data. This document provides guidance for establishing and maintaining a quality system in a laboratory.


5. General Quality Requirements for the Laboratory

5.1 Establishment and maintenance of a quality system shall include stated objectives in the following areas: a laboratory's adherence to test method requirements, calibration and maintenance practices, and its quality control program. Laboratory quality objectives should encompass the laboratory's continuous improvement goals as well as meeting customer requirements.

5.2 Management shall appoint a representative to implement and maintain the quality system in the laboratory.

5.3 Laboratory management shall review the adequacy of the quality system and the activities of the laboratory for consistency with the stated quality objectives at least annually.

5.4 The quality system shall have documented processes for:

5.4.1 Sample management (see Section 6),

5.4.2 Data and record management (see Section 7),

5.4.3 Producing accurate, reliable, and properly represented test results (see Section 8),

5.4.4 Audits and proficiency testing (see Section 9),

5.4.5 Corrective and preventive action (see Section 11),

5.4.6 Ensuring that procured services and materials meet the contracted requirements, and

5.4.7 Ensuring that personnel are adequately trained to obtain quality results.

6. Sample Management

6.1 The elements of sample management shall include at a minimum:

6.1.1 Procedures for unique identification of samples submitted to the laboratory.

6.1.2 Criteria for sample acceptance.

6.1.3 Procedures for sample handling.

6.1.4 Procedures for sample storage and retention. Items to consider when creating these procedures include:

6.1.4.1 Applicable government—local, state, or national—regulatory requirements for shelf life and time-dependent tests that set product stability limits,

6.1.4.2 Type of sample containers required to preserve the sample,

6.1.4.3 Control of access to the retained samples to protect their validity and preserve their original integrity,

6.1.4.4 Storage conditions,

6.1.4.5 Required safety precautions, and

6.1.4.6 Customer requirements.

6.1.5 Procedures for sample disposal in accordance with applicable government regulatory requirements.

NOTE 2—This may be handled through a separate chemical hygiene or waste disposal plan.

7. Data and Record Management

7.1 Reports of Analysis:

7.1.1 The work carried out by a laboratory shall be covered by a certificate or report that accurately and unambiguously presents the test results and all other relevant information.

NOTE 3—This report may be an entry in a Laboratory Information Management System (LIMS) or equivalent system.

7.1.2 The following items are suggested for inclusion in laboratory reports:

7.1.2.1 Name and address of the testing laboratory,

7.1.2.2 Unique identification of the report (such as serial number) on each page of the report,

7.1.2.3 Name and address of the customer,

7.1.2.4 Order number,

7.1.2.5 Description and identification of the test sample,

7.1.2.6 Date of receipt of the test sample and date(s) of performance of test, as appropriate,

7.1.2.7 Identification of the test specification, method, and procedure,

7.1.2.8 Description of the sampling procedure, where relevant,

7.1.2.9 Any deviations, additions to or exclusions from the specified test requirements, and any other information relevant to a specific test,

7.1.2.10 Disclosure of any nonstandard test method or procedure utilized,

7.1.2.11 Measurements, examinations, and derived results, supported by tables, graphs, sketches, and photographs as appropriate, and any failures identified,

7.1.2.12 Minimum-maximum product specifications, if applicable,

7.1.2.13 A statement of the measurement uncertainty (where relevant or required by the customer),

7.1.2.14 Any other information which might be required by the customer,

7.1.2.15 A signature and job title of person(s) accepting technical responsibility for the test report and the date of issue, and

7.1.2.16 A statement on the laboratory policy regarding the reproduction of test reports.

7.1.3 Items actually included in laboratory reports should be specified by laboratory management or agreements with customers, or both.

7.1.4 Procedures for corrections or additions to a test report after issue shall be established.

7.2 Reporting and Rounding the Data:

7.2.1 The reporting requirements specified in the test method or procedure shall be used (unless specifically required otherwise by the customer or applicable regulations).

7.2.2 If rounding is performed, the rounding protocol of Practice E 29 should be used unless otherwise specified in the method or procedure.
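As a hedged illustration of 7.2.2: the rounding method of Practice E 29 resolves a trailing digit of exactly 5 by rounding to the nearest even digit, a convention Python's decimal module exposes as ROUND_HALF_EVEN. The function name and example values below are illustrative assumptions, not part of either standard.

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_result(value, places):
    """Round a test result to a fixed number of decimal places using the
    round-half-to-even convention (as in the Practice E 29 rounding method)."""
    quantum = Decimal("1").scaleb(-places)  # e.g. places=2 -> Decimal('0.01')
    return Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_EVEN)

# Illustrative values: a dropped digit of exactly 5 rounds to the even neighbor.
print(round_result(0.865, 2))  # -> 0.86
print(round_result(0.875, 2))  # -> 0.88
print(round_result(2.25, 1))   # -> 2.2
```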

7.3 Records of Calibration and Maintenance:

7.3.1 Procedures shall be established for the management of instrument calibration records. Such records usually indicate the instrument calibrated, method or procedure used for calibration, the dates of last and next calibrations, the person performing the calibration, the values obtained during calibration, and the nature and traceability (if applicable) of the calibration standards (that is, certified values). Records may be electronic.

7.3.2 Procedures shall be established for the management of instrument maintenance records. Such records usually indicate the instrument maintained, the dates of last and next maintenance, and the person performing the maintenance. Records may be electronic.



NOTE 4—For instruments that require calibration, calibration and maintenance records may be combined.

7.4 Quality Control (QC) Testing Records:

7.4.1 The laboratory shall have documented procedures for creating and maintaining records for analysis of QC samples. It is recommended that such records include the sample name and source, the test(s) for which it is to be used, the assigned values and their uncertainty where applicable, and values obtained upon analysis. Additionally, it is recommended that the receipt date or date put into active QC use in the laboratory be documented, along with the expiration date (if applicable).

7.4.2 Procedures for retaining completed control charts should be established. It is recommended that these records include the date the control charts were changed and the reason for the change.

7.5 Record Retention:

7.5.1 The record system should suit the laboratory's particular circumstances and comply with any existing regulations and customer specifications.

7.5.2 All data shall be maintained according to laboratory, company, or regulatory agency requirements, or a combination thereof.

7.5.3 Procedures for retaining records of all original observations, calculations and derived data, calibration records, and final test reports for an appropriate period shall be established. The records for each test should contain sufficient information to permit satisfactory replication of the test and recalculation of the results.

7.5.4 The records shall be held in safe and secure storage. A system shall exist that allows locating the required documents in a reasonable period of time.

8. Producing Accurate, Reliable, and Properly Represented Test Results

8.1 The laboratory shall have documented test methods and procedures for performing the required tests. These shall be maintained up-to-date and be readily available to the laboratory staff. The test methods that are stated in the product specifications or agreed upon with customers should be used for sample analysis.

8.2 The laboratory shall have procedures for the approval, documentation, and reporting of deviations from the test method requirements or the use of alternative methods.

8.3 Procedures shall be established to ensure that measuring and testing equipment is calibrated, maintained properly, and is in statistical control. Items to consider when creating these procedures include:

8.3.1 Records of calibration and maintenance (see 7.3),

8.3.2 Calibration and maintenance schedule,

NOTE 5—The calibration frequency may vary with the instrument type and its frequency of use, some needing calibration before each set of analyses, others requiring calibration at less frequent periods, or triggered by a QC chart out-of-statistical-control situation.

8.3.3 Traceability to national or international standards,

NOTE 6—Where the concept of traceability to national or international standards of measurement is not applicable, the testing laboratory shall provide satisfactory evidence of test result accuracy (for example, by participation in a program of interlaboratory comparisons).

8.3.4 Requirements of the test method or procedure,

8.3.5 Customer requirements, and

8.3.6 Corrective actions (see Section 11).

8.4 The performance of apparatus and equipment used in the laboratory but not calibrated in that laboratory (that is, pre-calibrated, vendor supplied) should be verified by using a documented, technically valid procedure at periodic intervals.

8.5 Calibration standards shall be appropriate for the method and characterized with the accuracy demanded by the analysis to be performed. Quantitative calibration standards should be prepared from constituents of known purity. Use the primary calibration standards or CRMs specified or allowed in the test method.

8.5.1 Where appropriate, values for reference materials should be produced following the certification protocol used by NIST8,9,10 or other standards issuing bodies, and should be traceable to national or international standard reference materials, if required or appropriate.

8.5.2 The materials analyzed in proficiency testing programs meeting the requirements of Practice D 6300 or ISO 4259 may be used as reference materials, provided no obvious bias or unusual frequency distribution of results are observed. The consensus value is most likely the value closest to the true value of this material; however, the uncertainty attached to this mean value will be dependent on the precision and the total number of the participating laboratories.

8.6 The laboratory shall establish procedures for the storage of reference materials in a manner to ensure their safety, integrity, and protection from contamination (see 6.1.4).

8.7 Records of instrument calibration shall be maintained (see Section 7).

8.8 If an instrument is found to be out of calibration, and the situation cannot be immediately addressed, then the instrument shall be taken out of operation and tagged as such until the situation is corrected (see Section 11).

8.9 Quality Control Practices:

8.9.1 Use appropriate quality control charts or other quality control practices (for example, those described in Practice D 6299) for each test method performed by the laboratory unless specifically excluded. Document cases where quality control practices are not employed and include the rationale.

8.9.2 This practice advocates the regular testing of quality control samples with timely interpretation of test results. This practice also advocates using appropriate control charting techniques to ascertain the in-statistical-control status of test methods in terms of precision, bias (if a standard is being used), and method stability over time. For details concerning QC sample requirements and control charting techniques, refer to Practice D 6299. The generally accepted practices are outlined in 8.9.3 through 8.12.4.

8 Cali, J. P., Anal. Chem., 48, 802A, 1976.

9 Uriano, G. A., and Gravatt, C. C., CRC Crit. Revs. in Anal. Chem., 6, 361, 1977.

10 Alvarez, R., Rasberry, S. D., and Uriano, G. A., Anal. Chem., 54, 1226A, 1982.


8.9.3 Test QC samples on a regular schedule. Principal factors to be considered for determining the frequency of testing include: (1) frequency of use of the analytical measurement system, (2) criticality of the parameter being measured and business economics, (3) established system stability and precision performance based on historical data, (4) regulatory requirements, (5) contractual provisions, and (6) test method requirements.

8.9.3.1 If site precision for a specific test has not been established as defined by Practice D 6299, then the recommended frequency for analysis of QC samples is one QC sample out of every ten samples analyzed, or one QC sample each day that samples are analyzed, whichever is more frequent.

8.9.3.2 Once the site precision has been established as defined by Practice D 6299, and to ensure similar quality of data is achieved with the documented method, the minimal QC frequency may be adjusted based on the Test Performance Index (TPI) and the Precision Ratio (PR).

(1) For standard test methods with PR (as defined in 10.2) less than 4 and a TPI (as defined in 10.1) less than 0.8, consult 10.3 and the Standard Test Method for appropriate corrective action.

(2) For standard test methods with PR (as defined in 10.2) greater than or equal to 4 and a TPI (as defined in 10.1) less than 1.6, consult 10.3 and the Standard Test Method for appropriate corrective action.

8.9.3.3 Table 1 provides recommended minimal QC frequencies as a function of PR and TPI. For tests that are performed infrequently (for example, fewer than 25 samples analyzed monthly), it is recommended that at least one QC sample be analyzed each time samples are analyzed.

8.9.3.4 In many situations, the minimal QC frequency as recommended by Table 1 may not be sufficient to ensure adequate statistical quality control, considering, for example, the significance of use of the results. Hence, it is recommended that the flowchart in Fig. 1 be followed to determine if a higher QC frequency should be used.

8.9.3.5 The TPI should be recalculated and reviewed at least annually. Adjustments to QC frequency should be made based on the recalculated TPI by following sections 8.9.3.2 and 8.9.3.4.
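The Table 1 lookup described in 8.9.3.1 through 8.9.3.3 can be expressed as a simple rule. The sketch below is illustrative only; the function name, the handling of an undetermined TPI, and the treatment of values falling exactly on a break point are assumptions, and Table 1 together with 8.9.3.1 remains the governing text (the daily-minimum alternative in 8.9.3.1 is not modeled here).

```python
def minimal_qc_frequency(tpi=None, pr=None):
    """Return X, the recommended 'one QC sample out of every X samples',
    following Table 1. A TPI or PR not yet established maps to the default
    frequency of 1 in 10 (see 8.9.3.1)."""
    if tpi is None or pr is None:
        return 10
    # Table 1 uses different TPI break points depending on whether PR < 4 or PR >= 4.
    breaks = [(0.8, 10), (1.2, 20), (2.0, 35)] if pr < 4 else [(1.6, 10), (2.4, 20), (4.0, 35)]
    for upper, x in breaks:
        if tpi < upper:
            return x
    return 40

print(minimal_qc_frequency())               # 10 (site precision/TPI not yet established)
print(minimal_qc_frequency(tpi=1.0, pr=2))  # 20
print(minimal_qc_frequency(tpi=3.0, pr=5))  # 35
```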

8.9.4 QC testing frequency, QC samples, and their test values shall be recorded.

8.9.5 All persons who routinely operate the system shall participate in generating QC test data. QC samples should be treated as regular samples.

NOTE 7—Avoid special treatment of QC samples designed to “get a better result.” Special treatment seriously undermines the integrity of precision and bias estimates.

8.9.6 The laboratory may establish random or blind testing, or both, of QC or other known materials.

8.10 Quality Control Sample and Test Data Evaluation:

8.10.1 QC samples should be stable and homogeneous materials having physical or chemical properties, or both, representative of the actual samples being analyzed by the test method. This material shall be well-characterized for the analyses of interest, available in sufficient quantities, have concentration values that are within the calibration range of the test method, and reflect the most common values tested by the laboratory. For QC testing that is strictly for monitoring the test method stability and precision, the QC sample expected value is the control chart centerline, established using data obtained under site precision conditions. For regular QC testing that is intended to assess test method bias, RMs or CRMs with independently assigned ARVs should be used. The results should be assessed in accordance with Practice D 6299 requirements for check standard testing. For infrequent QC testing for bias assessment, refer to Practice D 6617.

NOTE 8—It is not advisable to use the same sample for both a calibrant and a QC sample. It is not advisable to use the same chemical lot number for both a calibrant and a QC sample.

8.10.2 If the QC material is observed to be degrading or changing in physical or chemical characteristics, this shall be immediately investigated and, if necessary, a replacement QC material shall be prepared for use.

NOTE 9—In a customer-supplier quality dispute, it may be beneficial to provide the customer with the laboratory’s test results on QC material to demonstrate testing proficiency. Practice D 3244 may be useful.

8.11 Quality Control Charts:

8.11.1 QC sample test data should be promptly plotted on a control chart and evaluated to determine if the results obtained are within the method specifications and laboratory-established control limits. The charts used should be appropriate for the testing conditions and statistical objectives. Corrective action should be taken and documented for any analyses that are out-of-control (see Section 11).

NOTE 10—Charts such as individual, moving average and moving range, exponentially weighted moving average, or cumulative summation charts may be used as appropriate. Refer to Practice D 6299 for guidance on plotting these charts.
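Purely as an illustrative sketch (Note 10 points to Practice D 6299 for the chart constructions it prescribes, which this does not reproduce): an individuals chart can be screened by computing a center line from historical QC data and limits from the mean moving range, using the conventional 2.66 individuals-chart constant. The data and function name below are assumptions.

```python
import statistics

def i_chart_limits(history):
    """Center line and control limits for an individuals (I) chart using the
    common moving-range estimate (2.66 x mean moving range). Generic sketch only;
    consult Practice D 6299 for the chart construction it prescribes."""
    center = statistics.mean(history)
    moving_ranges = [abs(b - a) for a, b in zip(history, history[1:])]
    mr_bar = statistics.mean(moving_ranges)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Illustrative QC history (made-up values) and a new QC result to screen.
history = [1.02, 0.98, 1.01, 0.99, 1.03, 1.00, 0.97, 1.02, 1.01, 0.99]
lcl, cl, ucl = i_chart_limits(history)
new_result = 1.08
out_of_control = not (lcl <= new_result <= ucl)
print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}  out of control: {out_of_control}")
```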

8.11.1.1 The charts should indicate the test method, date when the QC analyses were performed, and who performed them. Test samples should not be analyzed or results for samples should not be reported until the corresponding QC data are assessed and the testing process is verified to be in statistical control. (See 8.9.)

8.11.2 Adequate training should be given to the analysts to enable them to generate and interpret the charts.

8.11.3 It is suggested that the charts be displayed prominently near the analysis workstation, so that all can view and, if necessary, help in improving the analyses.

8.11.4 Supervisory and technical personnel should periodically review the QC charts.

TABLE 1 Minimal QC Frequency as a Function of Test Performance Index

TPI for Standard Test Methods with PR<4 | TPI for Standard Test Methods with PR≥4 | Nominal QC Frequency (1 QC out of every X Samples), Values of X | Approximate Percentage of QC Samples/Total Analyses
Not determined | Not determined | 10 | 9
<0.8 | <1.6 | 10 | 9
0.8–1.2 | 1.6–2.4 | 20 | 5
1.2–2.0 | 2.4–4.0 | 35 | 3
>2.0 | >4.0 | 40 | 2


FIG. 1 Flowchart for QC Frequency


 

 

8.11.5 The laboratory should establish written procedures outlining the appropriate interpretation of QC charts and responses to out-of-statistical-control situations observed.

8.11.5.1 When an out-of-statistical-control situation has been identified, remedial action should be taken before analyzing further samples. In all such cases, run the QC sample and ensure that a satisfactory result can be obtained before analyzing unknown samples.

NOTE 11—A generic checklist for investigating the root cause of unsatisfactory analytical performance is given in Appendix X1.

8.11.6 Out-of-control situations may be detected by one or more analyses. In these cases, it may be necessary to retest samples analyzed during the period between the last in-control QC data point and the QC data point that triggered the out-of-statistical-control notice (or event) using retained samples and equipment known to be in control. If the new analysis shows a difference that is statistically different from the original results, and the difference exceeds the established site precision of that test, the laboratory should decide on what further actions are necessary (see Section 11).

8.12 Revision of Control Charts—QC chart revision is covered in detail in Practice D 6299. Control charts shall be revised only when the existing limits are no longer appropriate. As a guideline, revisions may be needed when:

8.12.1 Additional information becomes available,

8.12.2 The process has improved,

8.12.3 A new QC material is initiated and the mean value is different than the previous QC material, or

8.12.4 There are major changes to the test procedure.

9. Audits and Proficiency Testing

9.1 Audits:

9.1.1 A laboratory shall have a system to periodically review its own practices to confirm continued conformance to the laboratory's documented quality system. Even if the laboratory is subjected to a formal external audit (for example, as a requirement of ANSI/ISO/ASQ Q9000), it is important to have internal audits since the internal reviewers may be more familiar with their laboratory's requirements than the external auditors.

9.1.2 Audits of test methods should be conducted to confirm adherence to the documented test methods. The performance of the entire test should be observed and checked against the official specified test method. An annual audit of test methods is recommended.

NOTE 12—These audits may be part of the quality system audits or may be separate.

9.1.3 Audit results shall be promptly documented. The team shall report the audit results to management having the authority and responsibility to take corrective action and to its management.

9.1.4 The findings and recommendations of these internal audits shall be reviewed by the laboratory management and acted upon to correct the deficiencies or nonconformances.

9.1.5 The effectiveness of any corrective actions taken in response to an audit shall be verified. The follow-up results shall be documented as required by the quality system procedures or laboratory policy, or both.

9.2 Proficiency Testing:

9.2.1 Regular participation in interlaboratory proficiency testing programs, where appropriate samples are tested by multiple test facilities using a specified test protocol, shall be integrated into the laboratory's quality control program. Proficiency test programs should be used as appropriate by the laboratory to demonstrate testing proficiency relative to other industry laboratories.

NOTE 13—Document the rationale for not participating in a proficiency test program.

9.2.2 The laboratory shall establish criteria for guiding their participation in interlaboratory testing programs. Such criteria may include factors such as the frequency of use of the target test method, the critical nature of how the customer uses the data, and regulatory considerations. Participation in proficiency test programs can provide a cost-effective alternative to regular CRM testing.

9.2.3 Participants may plot their deviations from the consensus values established by the proficiency test program averages on a control chart to ascertain if their measurement processes are non-biased. The precision of these exchange performance data can also be assessed against precision established by in-house QC sample testing for consistency (see Practice D 6299 for details).

9.2.4 Participation in proficiency testing shall not be considered as a substitute for in-house quality control, as described in 8.9, and vice versa.

10. Test Method Precision Performance Assessment

10.1 The test performance index (TPI) can be used to compare the precision of the laboratory measurements with the published reproducibility of a standard test method. The term TPI is defined as:

test performance index = test method reproducibility / site precision   (1)

NOTE 14—The ASTM International Committee D02 sponsored Interlaboratory Crosscheck Program employs a test performance index based on the ratio of the published ASTM reproducibility to the Robust Reproducibility calculated from the program data. This index is termed the TPI (Industry) to distinguish it from the definition in 10.1.

10.2 A precision ratio (PR) is determined for a given published test method so that the appropriate action criteria may be applied for a laboratory's TPI. The PR for a published test method estimates the influence that non-site specific variations have on the published precision. The PR can be calculated by dividing the test method's reproducibility by the repeatability as shown in Eq 2.

Precision Ratio, PR = Test Method reproducibility (R) / Test Method repeatability (r)   (2)

where the ratio of R/r is calculated to the nearest integer (that is, 1, 2, 3, 4, …).

10.2.1 A test method with PR greater than or equal to 4, for the purpose of this practice, is deemed to exhibit a significant difference between repeatability and reproducibility. For further explanation on why the greater than or equal to 4 criterion was chosen, please see Appendix X3.
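A minimal sketch of the Eq 1 and Eq 2 arithmetic combined with the Table 2 interpretation of 10.3. The function names and numerical values are illustrative assumptions; R, r, and the site precision would come from the published test method and the laboratory's own QC data.

```python
def precision_ratio(R, r):
    """Eq 2: PR = published reproducibility R / published repeatability r,
    taken to the nearest integer."""
    return round(R / r)

def test_performance_index(R, site_precision):
    """Eq 1: TPI = published reproducibility R / site precision R'."""
    return R / site_precision

def table_2_guidance(tpi, pr):
    """Apply the Table 2 action guidelines (see 10.3.1 and 10.3.2)."""
    satisfactory, marginal = (1.2, 0.8) if pr < 4 else (2.4, 1.6)
    if tpi > satisfactory:
        return "performance probably satisfactory"
    if tpi >= marginal:
        return "marginal; consider a method review"
    return "not consistent with published precision; investigate"

# Illustrative numbers only.
R, r, site_precision = 0.12, 0.05, 0.09
pr = precision_ratio(R, r)                       # 0.12 / 0.05 = 2.4 -> 2
tpi = test_performance_index(R, site_precision)  # 0.12 / 0.09 = 1.33...
print(pr, round(tpi, 2), table_2_guidance(tpi, pr))
```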


 

 

10.3 A laboratory's TPI may be a function of the sample type being analyzed and variations associated with that laboratory. As a general guideline, Table 2 may be used once the TPI of that laboratory and the PR of the published standard test method have been calculated. Similar information to that provided in Table 2 is given in 10.3.1 through 10.3.2.3.

10.3.1 For a published standard test method with a PR less than 4 the following TPI criteria should be applied.

10.3.1.1 A TPI greater than 1.2 indicates that the performance is probably satisfactory relative to ASTM published precision.

10.3.1.2 A TPI greater than or equal to 0.8 and less than or equal to 1.2 indicates that performance may be marginal and the laboratory should consider method review for improvement.

10.3.1.3 A TPI less than 0.8 suggests that the method as practiced at this site is not consistent with the ASTM published precision. Either laboratory method performance improvement is required, or ASTM published precision does not reflect achievable precision. Existing interlaboratory exchange performance (if available) should be reviewed to determine if the latter is plausible.

10.3.2 For a published standard test method with a PR greater than or equal to 4 the following TPI criteria should be applied.

10.3.2.1 A TPI greater than 2.4 indicates that the performance is probably satisfactory relative to ASTM published precision.

10.3.2.2 A TPI greater than or equal to 1.6 and less than or equal to 2.4 indicates that performance may be marginal and the laboratory should consider method review for improvement.

10.3.2.3 A TPI less than 1.6 suggests that the method as practiced at this site is not consistent with the ASTM published precision. Either laboratory method performance improvement is required, or ASTM published precision does not reflect precision achievable. Existing interlaboratory exchange performance (if available) should be reviewed to determine if the latter is plausible.

10.3.3 A laboratory may choose to set other benchmarks for TPI, keeping in mind that site precision of an adequately performing laboratory cannot, in the long run, exceed the practically achievable reproducibility of the method when PR is less than 4 or approaches repeatability when PR is much greater than 4.

NOTE 15—Experience has shown that, for some methods, published reproducibility is not in good agreement with the precision achieved by participants in well-managed crosscheck programs. Users should consider this fact when evaluating laboratory performance using TPI.

10.4 A laboratory should review their precision obtained for multiple analyses on the same sample. The site precision of the QC samples can be compared with the reproducibility or repeatability given in the standard test methods to indicate how well a laboratory is performing against the industry standards.

10.5 A laboratory precision significantly worse than the published test method reproducibility may indicate poor performance. An investigation should be launched to determine the root cause for this performance so that corrective action can be undertaken if necessary. Such a periodic review is a key feature of a laboratory's continuous improvement program.

11. Corrective and Preventive Action

11.1 The need for corrective and preventive action may be indicated by one or more of the following unacceptable situations:

11.1.1 Equipment out of calibration,

11.1.2 QC or check sample result out of control,

11.1.3 Test method performance by the laboratory does not meet performance criteria (for example, precision, bias, and the like) documented in the test method,

11.1.4 Product, material, or process out of specification data,

11.1.5 Outlier or unacceptable trend in an interlaboratory cross-check program,

11.1.6 Nonconformance identified in an external or internal audit,

11.1.7 Nonconformance identified during review of laboratory data or records,

11.1.8 Customer complaint.

11.2 When any of these situations occur, the root cause should be investigated and identified. Procedures for investigating root cause should be established. Items to consider when creating these procedures include:

11.2.1 Determining when the test of equipment was last known to be in control,

11.2.2 Identifying results that may have been adversely affected,

TABLE 2 Guidelines for Action Based on TPI

TPI for Standard Test Methods with PR<4 | TPI for Standard Test Methods with PR≥4 | Recommended Quality Improvement Action
>1.2 | >2.4 | Indicates that the performance is probably satisfactory relative to ASTM published precision.
0.8 to 1.2 | 1.6 to 2.4 | Indicates that the performance is probably satisfactory relative to ASTM published precision, however a method review could be necessary to improve its performance.
<0.8 | <1.6 | This condition suggests that the method as practiced at this site is not consistent with the ASTM published precision. Either laboratory method performance improvement is required, or the ASTM published precision does not reflect precision achievable. Existing interlaboratory exchange performance (if available) should be reviewed to determine if the latter is plausible.


 

 

11.2.3 How to handle affected results already reported to a customer,

11.2.4 What to do if the root cause cannot be determined, and

11.2.5 What to do if it is determined that the original data is correct.

11.2.6 It is possible that the analytical results are correct, even if they don't meet specifications. Procedures should consider this possibility. See Appendix X1 for a checklist for investigating the root cause of unsatisfactory analytical performance.

11.3 Procedures should also be established for the identification and implementation of appropriate corrective and preventive action so that the situation does not reoccur. This may involve:

11.3.1 Training or retraining personnel,

11.3.2 Reviewing customer specifications,

11.3.3 Reviewing test methods and procedures,

11.3.4 Establishing new or revised procedures,

11.3.5 Instrument maintenance and repair,

11.3.6 Re-preparation of reagents and standards,

11.3.7 Recalibration of equipment,

11.3.8 Re-analysis of samples, and

11.3.9 Additional QC sample analysis.

11.3.10 The situation, root cause, and corrective/preventive action taken should be documented promptly. A corrective and preventive action report is a suitable format for documentation.

11.3.11 The report should be reviewed and approved by management and then verified for effectiveness of corrective/preventive action.

11.4 Quality control charts (see 8.11) are a method of preventive action and should be evaluated on a regular basis to prevent, when possible, out-of-statistical-control situations.

12. Customer Complaints

12.1 A procedure shall exist to follow-up on customer complaints or non-conformances brought to the laboratory’s attention by a client. The result of such investigation should be communicated to the customer as soon as practical.

13. Training

13.1 Laboratory management shall ensure that all staff performing testing or interpreting data, or both, are appropriately trained.

13.2 Laboratory training should cover at a minimum the following areas: safety, test methods, and company policies and procedures. Training is specifically required as specified in: 5.4.7, 8.11.2, 11.3, and X1.1.12.

13.3 Records of training should be maintained.

14. Relationship with Other Quality Standards

14.1 Some laboratories in the petrochemicals testing area have been registered to ISO/IEC 17025. There are a number of similarities between the ISO standard and this practice in key areas of managing laboratory quality. For example:

Requirement | ISO/IEC 17025 | ASTM Practice D 6792
Quality System | 4.2 | 5.1
Document Control | 4.3 | 8.1; 8.2
Contract Review | 4.4 | 5.4.6
Complaints | 4.8 | 12.1
Corrective Action | 4.10 | 11; Appendix X1
Preventive Action | 4.11 | 11.4
Control of Records | 4.12 | 7.3.1; 7.4; 7.5
Internal Audits | 4.13 | 9.1
Management Reviews | 4.14 | 5.3
Personnel | 5.2 | 5.4.7, 13.1, 13.2
Calibration | 5.6.2.1 | 8.3–8.8
Sample Handling | 5.8 | 6.1
Quality Control Procedures | 5.9 | 8.9
Use of Quality Control Materials | 5.9.a | 8.10
Proficiency Testing | 5.9.b | 9.2
Data Reports | 5.10 | 7.1

14.2 Measurement Uncertainty—For test methods under the jurisdiction of Committee D02, measurement uncertainty as required in ISO/IEC 17025, as practiced by a laboratory, can be estimated by multiplying the site precision standard deviation, as defined in Practice D 6299, by 2.

NOTE 16—The complexity and empirical nature of the majority of D02 methods preclude the application of rigorous measurement uncertainty algorithms. In many cases, interactions between the test method variables and the measurand cannot be reasonably estimated due to the covariance of the variables that affect the measurand. The site precision approach estimates the combined effects of these variables on the total uncertainty for the measurand.

NOTE 17—The methodology of using site precision established using QC materials and control charts to estimate measurement uncertainty assumes that the laboratory is unbiased. This assumption should be validated periodically using check standards. See Practice D 6617 or Practice D 6299 for further guidance.
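A minimal arithmetic sketch of 14.2, assuming an unbiased laboratory and a site precision standard deviation taken from control-chart data; the value below is made up for illustration.

```python
# Illustrative site precision standard deviation taken from a QC control chart.
site_precision_sd = 0.15

# 14.2: estimated measurement uncertainty = 2 x site precision standard deviation.
estimated_uncertainty = 2 * site_precision_sd
print(f"estimated measurement uncertainty = {estimated_uncertainty:.2f}")
```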

15. Keywords

15.1 audit; calibration; control charts; proficiency testing; quality assurance; quality control; test performance index


 

 

APPENDIXES

(Nonmandatory Information)

X1. CHECKLIST FOR INVESTIGATING THE ROOT CAUSE OF UNSATISFACTORY ANALYTICAL PERFORMANCE

X1.1 To identify why a laboratory’s data may have been considered a statistical outlier or to improve the precision, or both, the following action items (not necessarily in the order of preference) are suggested. There may be additional ways to improve the performance.

X1.1.1 Check the results for typos, calculation errors, and transcription errors.

X1.1.2 Reanalyze the sample; compare to site precision, or, if not available, test method repeatability.

X1.1.3 Check the sample for homogeneity or contamination, and that a representative sample has been analyzed.

X1.1.4 Review the test method and ensure that the latest version of the ASTM test method is being used. Check the procedure step-by-step with the analyst.

X1.1.5 Check the instrument calibration.

X1.1.6 Check the statistical quality control chart to see if the problem has been developing earlier.

X1.1.7 Check the quality of the reagents and standards used, and whether they are expired or contaminated.

X1.1.8 Check the equipment for proper operation against the vendor’s operating manual.

X1.1.9 Perform maintenance or repairs, or both, on the equipment following guidelines established by the vendor.

X1.1.10 After the problem has been resolved, analyze a certified reference material if one is available, or the laboratory quality control sample, to ascertain that the analytical operation is under control.

X1.1.11 Provide training to new analysts and, if necessary, refresher training to experienced analysts.

X1.1.12 Document the incident and the learnings for use in the future if a similar problem occurs.

X2. SELF-ASSESSMENT CHECKLIST TO EVALUATE COMPLIANCE WITH PRACTICE D 6792

X2.1 See the checklist in Fig. X2.1.


 

 

FIG. X2.1 Self-Assessment Checklist to Evaluate Compliance with Practice D 6792


X3. COMPARISON OF REPEATABILITY, REPRODUCIBILITY, PRECISION RATIO AND VARIANCE RATIO TEST OF VARIOUS ROUND ROBINS

X3.1 Practice D 6300, subsection A1.7 on Variance Ratio Test (F-Test), provides a detailed discussion of how to determine when significant bias exists for two data sets using the variance ratio. Both the variance ratio (F value) and the precision ratio (PR) were calculated for 38 round robin data sets. Generally, the correlation between F and R/r is not statistically significant enough to suggest that PR could be used to accurately predict the existence of laboratory-laboratory bias for a given test method. However, this practice is not intended as a detailed statistical analysis of bias between laboratories; rather, the purpose of this practice is to provide some general guidelines for assessing the performance of a laboratory.

X3.2 Generally, for a typical ASTM test method (for example, a typical number of laboratories, six or more, and a typical number of samples studied, ten or more), an F value of 5 or greater exceeds the 5 % critical value given in Practice D 6300, Table A1.6 on critical 5 % values of F, suggesting a bias exists between the laboratories. In addition, when the PR value is equal to or greater than 4, the F value is greater than 5. This suggests that some laboratory bias may exist in the test method's reproducibility statement. This observation was the rationale for selecting equal to or greater than 4 as the criterion for switching to more severe performance assessment criteria.
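For illustration only: the 5 % critical F value that X3.2 refers to (tabulated in Practice D 6300, Table A1.6) can also be obtained from a statistics library. The degrees of freedom and the observed F below are assumed values for the sketch, not taken from the round-robin data sets, and the study design in Practice D 6300 governs how the degrees of freedom are actually determined.

```python
from scipy.stats import f

# Assumed degrees of freedom for an illustrative round robin
# (these depend on the actual study design; see Practice D 6300).
dfn, dfd = 8, 45
critical_f = f.ppf(0.95, dfn, dfd)  # 5 % upper critical value

observed_f = 5.2  # illustrative variance ratio from a round-robin analysis
print(f"critical F (5 %) = {critical_f:.2f}, observed F = {observed_f}, "
      f"significant: {observed_f > critical_f}")
```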

X3.3 The relationship of repeatability, Reproducibility, and Site Precision as it relates to the performance assessment criteria of a test method with PR<4 for a laboratory is represented in Fig. X3.1. This figure illustrates that a laboratory may have a site precision that is less than Reproducibility and similar in magnitude to the published method's repeatability.

X3.4 Fig. X3.2 shows a relationship of repeatability, Reproducibility, and Site Precision for a test method with PR>4 similar to that shown in Fig. X3.1. However, the illustration in Fig. X3.2 has the performance assessment criteria for both PR<4 and PR>4 applied, to demonstrate the difference between these two criteria.

X3.4.1 Reviewing Fig. X3.2, a laboratory may have a site precision similar to the test method's reproducibility that is significantly greater than the published method's repeatability, but based on the PR<4 performance assessment criteria, is still considered to be generating acceptable results. Using the PR>4 performance criterion forces acceptable site precision to be more evenly distributed between repeatability and reproducibility so that a more thorough review of the laboratory's performance may be made.

FIG. X3.1 Comparison of Reproducibility, Repeatability and TPI Guidelines for Action for a Test Method with PR<4

FIG. X3.2 Comparison of Reproducibility, Repeatability and TPI PR<4 and PR≥4 Guidelines for Action for a Test Method with PR≥4


 

 

SUMMARY OF CHANGES

Subcommittee D02.94 has identified the location of selected changes to this standard since the last issue (D 6792–06) that may impact the use of this standard. (Approved July 1, 2007.)

(1) Changed type of standard from Guide to Practice throughout.

(2) Revised 6.1.1, 7.1.2.13, 7.2.1, 8.6, 8.11.5, and 11.1.3.

(3) Added 8.9.1, 9.2.1, 9.2.2, Section 12, Section 13, and Note 13 and Note 14.

Subcommittee D02.94 has identified the location of selected changes to this standard since the last issue (D 6792–05) that may impact the use of this standard. (Approved May 1, 2006.)

(1) Added 3.2.1.

(2) Revised throughout 8.9.

(3) Revised Table 1.

(4) Revised throughout 10.2.

(5) Added 10.3.1.

(6) Added Table 2.

(7) Revised Fig. 1.

(8) Added Appendix X3.

ASTM International takes no position respecting the validity of any patent rights asserted in connection with any item mentioned in this standard. Users of this standard are expressly advised that determination of the validity of any such patent rights, and the risk of infringement of such rights, are entirely their own responsibility.

This standard is subject to revision at any time by the responsible technical committee and must be reviewed every five years and if not revised, either reapproved or withdrawn. Your comments are invited either for revision of this standard or for additional standards and should be addressed to ASTM International Headquarters. Your comments will receive careful consideration at a meeting of the responsible technical committee, which you may attend. If you feel that your comments have not received a fair hearing you should make your views known to the ASTM Committee on Standards, at the address shown below.

This standard is copyrighted by ASTM International, 100 Barr Harbor Drive, PO Box C700, West Conshohocken, PA 19428-2959, United States. Individual reprints (single or multiple copies) of this standard may be obtained by contacting ASTM at the above address or at 610-832-9585 (phone), 610-832-9555 (fax), or service@astm.org (e-mail); or through the ASTM website (www.astm.org).


