Finnish Environment Institute | Suomen ympäristökeskus | Finlands miljöcentral

8 Data Quality Assurance and Management

8.1 Overview of data quality management in the IM programme

8.1.1 General

Many environmental monitoring programmes have failed to achieve their aims because of inadequate data quality management. It is often forgotten that the quality of the data determines the nature of the analyses that can be undertaken, and the quality of the results. Given the subtle nature of many of the changes in ecosystem processes associated with the atmospheric deposition of pollutants, the required data quality is very high.

The general objective of a cooperative international programme to monitor the effects of air pollution on ecosystems requires that all data generated by the various participants should be comparable on an objective basis. High data quality is essential, and the data must be consistent both in time (to assess trends) and in space (to allow comparisons between sites and countries). To achieve such comparability, the methods employed to collect materials and to undertake chemical analyses of these must be thoroughly documented. A quality assurance programme must be carried out to demonstrate that results of adequate accuracy are being obtained. Only through such objective control can environmental variances or observed changes be assigned a degree of confidence. The Quality Assurance and Quality Control procedures should cover all activities performed at the site and in the laboratory.

8.1.2 Definitions

Quality Assurance (QA) is defined as "those operations and procedures which are undertaken to provide measurement data of stated quality with a stated probability of being right" (Taylor 1987).

Quality Control (QC) relates to the laboratory procedures used to reduce random errors and systematic errors, or maintain them within certain specified tolerable limits.

QA samples are used to assess data quality (as defined above) and for monitoring the internal QC procedures. QA samples are submitted blind to laboratories, i.e. their identity in the batch and their composition are unknown to the analyst. They are usually included in duplicate and randomly placed among the routine samples. QC samples are known to (created by) the laboratory and used to evaluate calibration and standardisation on instruments, problems of contamination or analytical interference.

Each NFP is required to submit a report on the QA/QC procedures followed by the laboratories involved. This should include the detection limits (DL) of the equipment used to analyse each substance. In the case of the Soil chemistry subprogramme, the DLs refer to the concentration of the substance in the extract/digestion.

Data verification comprises procedures performed on the raw data to identify and remove transcription errors and to check the completeness, precision and consistency of the data. Once the raw data have been verified, they can be validated. Validation procedures include the identification of outlying data points and their assessment for inclusion or omission based on assigned levels of confidence. Internal consistency checks (ICC) are useful in identifying outliers and errors. ICCs are checks made on the routine sample results and consist of standard relationships, e.g. total S ≥ SO4-S, and total N and organic C should show a strong positive correlation.
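An internal consistency check can be sketched in a few lines of code (Python here, since the manual prescribes no language; the record keys and function name are illustrative, while the two relationships are those named in the text):

```python
# A sketch of an internal consistency check (ICC) pass over routine sample
# results. The two relationships come from the text; record keys, function
# name and units (mg/l) are illustrative assumptions.

def icc_flags(sample):
    """Return a list of ICC violations for one sample (concentrations in mg/l)."""
    flags = []
    # Total S must be at least as large as its sulphate-S fraction.
    if sample["total_S"] < sample["SO4_S"]:
        flags.append("total S < SO4-S")
    # Total N must be at least the inorganic N fraction (NO3-N + NH4-N).
    if sample["total_N"] < sample["NO3_N"] + sample["NH4_N"]:
        flags.append("total N < NO3-N + NH4-N")
    return flags
```

Mapping such a function over a batch and reporting the non-empty flag lists gives a cheap first screen before formal validation.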

8.1.3 Quality assurance steps in the IM Programme

A major problem with data derived from international monitoring programmes is their consistency through time and between countries. Most countries have preferred techniques which they use for the assessment of environmental parameters. These techniques often produce results that are not directly comparable, creating major problems for assessments of data derived from more than one country. In some cases, differences between methodologies invalidate the combination of data into international datasets. These problems have been identified by ICP Waters, who in 1996 undertook a major evaluation of the data held in their database, resulting in many data being discounted from future analyses.

In order to avoid these problems, the ICP Integrated Monitoring has implemented the following steps. These are based on ensuring the scientific quality of the results, even at the expense of rejecting some of the data submitted by National Focal Points.

  1. In certain critical cases, field methods have been clearly specified for specific parameters, and data obtained for these parameters using other field methods are unacceptable.
  2. Laboratories involved in chemical analyses of materials derived from IM sites should be certified under one of the laboratory accreditation systems, e.g. EN 45001 or ISO/IEC Guide 25. Data from uncertified laboratories will be subject to more detailed scrutiny prior to acceptance.
  3. When a laboratory's practices deviate from the recommended analytical method, the laboratory is obliged to demonstrate that the methods produce values that are similar (± 10%) to the recommended method.
  4. All chemical parameters are the subject of ring tests.
  5. All laboratories are obliged to take part in ring tests under full identification. If a particular parameter lies outside an acceptable (± 10%) deviation, no data related to that parameter are included in the database. Only data from laboratories that have participated in ring tests are acceptable.
  6. ICP IM recognises the major difficulties with the assessment and/or interpretation of some biological response parameters, and in particular with the parameters associated with the Forest damage subprogramme. Consequently, where such uncertainties exist, it has reduced the emphasis given to these parameters in favour of more reliable indicators of biological response.
  7. Each NFP is expected to ensure that good laboratory practice is followed and is responsible for the quality of data reported to ICP IM Programme Centre.
  8. The results of quality controls, laboratory intercalibrations etc. undertaken (either with IM samples in particular or of the laboratory in general) should be reported to the IM Programme Centre. The Programme Centre also encourages participation in international intercalibration exercises.

8.2 Quality assurance routines in the field and in sampling

Traditionally, the greatest amount of attention in QA programmes is given to laboratory procedures. For the analysis of materials collected in the natural environment, such an emphasis is misplaced as the greatest sources of error are related to field sampling, transportation back to the laboratory, and sample preparation stages (Summers 1972). In the IM Programme, particular attention must be given to these stages.

Materials collected in the IM programme include water samples (from precipitation, throughfall and stemflow, soil water, groundwater, running water and lake water), plant materials, organic and inorganic soil materials, and entire organisms (e.g. benthic animals). Each of these requires separate methods and consequently separate QA protocols.

All methods used within a country should be documented, and change in methods over time noted. Standard operational procedures should be followed for all activities. Necessary equipment, cleaning materials, sufficient supply of spare parts etc. must be available. All operators should be well-trained and sites and equipment must be inspected/controlled at least once a year by the quality assurance manager/data originator. The QA/QC routines in the field include addition of field blanks and control samples, and also requirements for sample transportation and storage.

8.2.1 Collection and handling of water chemistry samples

Prevention of sample contamination or sample changes during collection or while in storage may be critical in obtaining accurate measurements. All containers used for either sample collection or storage must be free of any significant quantity of the determinants relative to the lowest concentration to be measured, and the containers must be of a material that will neither adsorb nor release measurable quantities of the determinant.

All materials which come into contact with the sample must be chemically inert. Polyethylene, tetrafluoroethylene and tetrafluoroethylene-fluorinated ethylene-propylene copolymer are generally recommended because of their excellent chemical properties. The mechanical properties of these materials must be taken into account in the construction of samplers. Polyethylene may become brittle when exposed to sunlight and should be replaced after one year of use involving such exposure. Borosilicate glass should be properly acid-washed and rinsed in deionized water prior to use, but the use of glass is not generally recommended: soft glass will contaminate the sample with alkali and alkaline earth cations. Metals and artificial materials with unknown chemical properties or composition should be avoided. If such materials have to be used in joints or in other constructional details of the sampling equipment, boil a sample of the material in deionized water and analyse the water afterwards as a water sample.

Carefully rinse all bottles with deionized water before use. All other sampling equipment must be leached in dilute acid for two to three days before being used, and stored in plastic bags. When analysing trace metals, samples must be collected and stored in acid-washed bottles. Extreme care must be exercised to avoid contamination, and whenever possible sample containers must be filled completely and tightly capped to minimise any interchange with entrapped air. Glass bottles are recommended for samples for carbon and mercury determinations. New glass containers should initially be washed in hot chromic acid solution. Later cleaning can be done using a mild detergent followed by rinsing in tap water and then finally in distilled or deionised water. Plastic bottles should be washed with concentrated hydrochloric acid or 50% nitric acid, or with a commercial decontaminant such as Decon (Cryer and Trudgill 1990).

Sample quantities
A general principle is that the larger the sample, the smaller the relative effect of contamination from the sample bottle. For lake and river water samples, a sample size of 1 litre is normally collected. Precipitation and soil water sample volumes are normally variable, being determined by availability.

Storage conditions between collections
It is important that bottles are kept away from light and kept cool during and after sampling. If the sample cannot be stored in a pithole, it should be covered, for example using aluminium foil.

Transportation to the laboratory
All laboratory bottles shall be clearly marked with plot number, collector number, sample type (e.g. throughfall, stemflow) and sampling period. Sample identification and documentation of the sampling must be firmly and accurately maintained for every sample. This documentation is an integral part of the sample information and must be entered into the database. Sample documentation should include as a minimum:

  • Sample site identification   
  • Date of sampling   
  • Sampling depth   
  • Additional notes (e.g. of suspected contamination)

Laboratory bottles should be transported to the laboratory as soon as possible, in warm weather preferably in cool boxes.

Sample storage
Samples intended for major ion and nutrient analysis should be collected, stored in the dark at about 4°C and transferred to the laboratory for analysis as soon as possible. The transport and storage period between sampling and analysis should be kept to a minimum. Samples stored in polyethylene bottles for even a few hours are likely to lose some of their solutes (particularly phosphorus) as a result of adsorption onto the bottle walls.

Surface water samples (Runoff water, lake chemistry) intended for metal analysis may be preserved by adding acid, usually using nitric acid. Preservation at pH 2 will in most cases retain the total and dissolved metals for several weeks. If the preservative is added in the field, extreme care must be taken to prevent contamination of the major ion sample with nitric acid. If determining the dissolved fraction, it is necessary to filter the sample prior to preservation. Filters, when used, should have a 0.40 - 0.45 µm membrane (Whatman 42 or GFC) and be rinsed with deionized water prior to use. In general filtration is not necessary and if samples are filtered, this should be indicated when reporting the results.

Field blanks for water samples
In order to check for possible contamination at the site, field blank tests should be carried out at least once a month. For this purpose, 50-100 ml of deionized water should be poured into the sample collector (where appropriate) after it has been washed/rinsed or re-installed in the field. These blanks should then be subjected to the same procedure as an ordinary water sample.

8.3 Laboratory practices

8.3.1 In-laboratory quality control

All laboratories that participate in co-operative programmes should provide documented evidence that in-laboratory quality control is maintained to assure the accuracy and uniformity of routine laboratory analyses. Such documentation is routine for a certified laboratory. Unless in-laboratory quality control is carried out as normal laboratory operating practice, there is little benefit in between-laboratory quality control programmes.

In-laboratory quality control should include:

  1. Complete and thorough documentation of the methods of control (for example: standard deviation of a single sample, use of control samples and, in particular, control charts);
  2. Documented evidence of analytical performance, accuracy of in-house standards, within-run precision, between-run controls and accuracy of the methods employed;
  3. Evidence of sample-specific data quality, such as an adequate ionic balance or specific conductance determination for individual samples;
  4. Evidence of adequate performance by analysis of external audit materials, standard samples of adequate matrix, etc.

8.3.2 Between-laboratory quality control

Between-laboratory quality control is necessary in a multi-laboratory programme to assure clear identification and control of the bias between the analyses carried out by individual participants of the programme. This quality assessment does not substitute for the routine in-laboratory control that assures consistency in day-to-day operations. Instead it is intended to assure that systematic biases do not exist between determinations of the different programme participants. Such biases may arise through the application of different methods, errors in laboratory standards or through inadequate in-laboratory control.

It is strongly recommended to participate annually in international inter-comparisons for all analysed compounds. It is also recommended to participate in field inter-comparisons. The IM Programme Centre will be able to give information about relevant inter-calibrations. All data should be verified and validated.

Between-laboratory quality control on water samples will be carried out by the ICP Waters programme. Quality control for plant and soil materials will be organised by the Forest Foliar Co-ordinating Centre (Vienna) and the Forest Soil Co-ordinating Centre (Gent), respectively.

8.3.3 Quality of measurements

The quality of the measurements should also be judged by the ion balance and by comparing calculated and measured conductivity. The target accuracy for the ion balance, also used by the ICP Waters programme, is that the difference between the sum of cations and the sum of anions should not exceed 10% of the sum of cations. Organic anions can be approximated from TOC/DOC. The calculated conductivity will indicate whether one or several analytical measurements are too low or too high.
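The ion-balance criterion can be sketched as follows (a minimal Python sketch; the equivalent weights are molar mass divided by charge, and the ion list is illustrative, since a real check would cover every analysed ion, including organic anions estimated from TOC/DOC):

```python
# A minimal sketch of the ion-balance criterion: concentrations are converted
# to milliequivalents per litre, then the cation-anion difference is compared
# against 10% of the cation sum. Ion list and dictionary keys are assumptions.

EQ_WEIGHT = {  # mg per milliequivalent (molar mass / charge)
    "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10, "NH4_N": 14.01,
    "SO4": 48.03, "Cl": 35.45, "NO3_N": 14.01, "HCO3": 61.02,
}
CATIONS = ("Ca", "Mg", "Na", "K", "NH4_N")
ANIONS = ("SO4", "Cl", "NO3_N", "HCO3")

def ion_balance_ok(conc_mg_l, tolerance=0.10):
    """True if |sum(cations) - sum(anions)| <= tolerance * sum(cations)."""
    cations = sum(conc_mg_l[ion] / EQ_WEIGHT[ion] for ion in CATIONS)
    anions = sum(conc_mg_l[ion] / EQ_WEIGHT[ion] for ion in ANIONS)
    return abs(cations - anions) <= tolerance * cations
```

For example, a sample with Ca 4.0, Mg 1.0, Na 2.3, K 0.78, NH4-N 0.14, SO4 9.6, Cl 3.5, NO3-N 0.7 and HCO3 3.0 mg/l balances to within a few percent and passes the check.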

8.3.4 Specific data quality control procedures

In some subprogrammes (e.g. Meteorology, Soil chemistry), data quality procedures are specific and have been described within the appropriate chapter. However, many quality control procedures are more general and are described here.

8.3.5 Water analysis

The laboratory must check on its performance, with respect to detection limits, precision and repeatability, by repeated analysis of control solutions etc. All data should be verified and validated.

The total error of individual analytical results should not exceed a value corresponding to the required detection limit (L), or a percentage of the result (P%), whichever is the greater. Laboratories using less sensitive methods should report deviations to the Programme Centre. Data Quality Objectives for EMEP are:

  • 10% accuracy or better for oxidized sulphur and oxidized nitrogen in single analysis in the laboratory   
  • 15% accuracy or better for other components in the laboratory   
  • 0.5 units for pH   
  • 15-25% uncertainty for the combined sampling and chemical analysis  
  • 90% data completeness for the daily values.
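The total-error rule above (error tolerated up to the detection limit L or P% of the result, whichever is greater) can be written down directly; the sulphate figures in the comment are illustrative:

```python
# Maximum tolerated total error: the detection limit L, or P% of the measured
# value, whichever is greater (the rule stated in the text).

def tolerated_error(value, L, P):
    """value and L in the same units; P in percent."""
    return max(L, value * P / 100.0)

# E.g. with L = 0.2 mg/l and P = 10% (illustrative values for sulphate),
# a 5.0 mg/l result may err by up to 0.5 mg/l, while results near the
# detection limit fall back to the 0.2 mg/l floor.
```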

Suggested target accuracies (P%) and detection limits (L) for the measurement of water quality determinants:

Determinant                            Detection limit (L)
…                                      0.02 mg/l
…                                      0.01 mg/l
…                                      0.02 mg/l
…                                      0.02 mg/l
…                                      0.2 mg/l
Sulphate (as SO4)                      0.2 mg/l
Nitrate (+ nitrite) 1) (as N)          10 µg/l
Reactive aluminium                     10 µg/l
Non-labile (organic) aluminium         10 µg/l
Labile (inorganic) aluminium           10 µg/l
Dissolved organic carbon 2), as C      0.2 mg/l
pH                                     0.1 pH units
Conductivity                           0.2 mS/m
…                                      0.005 mmol/l
Total phosphorus, as P                 2 µg/l
Soluble reactive phosphate (as P)      2 µg/l
Temperature                            ±0.2 ºC

1) Whether nitrite is included depends on the method. In well-aerated surface waters nitrite is usually close to zero.

2) In samples with low particle content total organic carbon (TOC) may be used (no filtering).

The quality of the water chemistry data is strongly linked to the performance of the chemical laboratory. Control samples should be prepared, and analysed regularly as ordinary water samples, in order to keep an independent check on the chemical analyses performed. Standard rainwater samples are available from NIST and BCR, and it is advised to use such samples as an external reference solution analysed only 2-4 times during the year, and in-laboratory prepared control samples for daily control work. The control samples should approximate the expected mean concentration in the water samples, and may be prepared using the following compounds:

  • (NH4)2SO4
  • Nitric acid   
  • CaSO4 · 2H2O   
  • MgSO4 · 7H2O   
  • NaCl   
  • KCl

Determination of accuracy and precision

To quantify precision, accuracy and the detection limit in the laboratory:

  • 5% of the samples should be split and the results used to quantify the analytical precision   
  • 5% of the samples should have known, and realistic, concentrations and should be run between the normal samples to control the performance of the analytical system  
  • 5% of the samples should be blank samples used to quantify the analytical detection limit.

The methods used to determine accuracy, precision and the detection limits from these data are provided in the EMEP Manual (Sections 5.6 and 5.7).
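The split-sample and blank fractions translate into simple statistics. The sketch below is hedged: it uses the pooled within-pair standard deviation for precision and the common three-sigma-of-blanks convention for the detection limit, while the EMEP Manual sections cited above give the authoritative formulas.

```python
# Hedged sketch: precision as the pooled within-pair standard deviation of
# split-sample pairs, and the detection limit as 3 x the standard deviation
# of blank results (a common convention, not necessarily the EMEP formula).
import statistics

def precision_from_splits(pairs):
    """Pooled within-pair standard deviation of split-sample result pairs."""
    # For duplicates, the within-pair variance estimate is (a - b)^2 / 2.
    return (sum((a - b) ** 2 for a, b in pairs) / (2 * len(pairs))) ** 0.5

def detection_limit(blanks):
    """Detection limit estimated as 3 x standard deviation of blank results."""
    return 3 * statistics.stdev(blanks)
```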

8.3.6 Soil analysis

The greatest variation in the results obtained for element contents in soils is likely to arise from the nature of the extractants used for the analyses. In this respect, it is important to note that the total content of a particular element may bear little relation to the content available to plants. There is no general agreement over the most suitable extractant for each element, and conscientious laboratories will undertake repeat analyses using different extractants.

Data Plausibility
Detailed information on methods on checking data, QC and precision and accuracy, is regularly published by the American Public Health Association.

A simple plausibility check can be made to see if the sum of cations is balanced by the sum of anions. If there is a difference that cannot be explained by any missing ions, this should be brought to the attention of the laboratory. Other simple checks include looking at scatter plots between parameters: SO4-S and total S concentrations should be strongly correlated, as should PO4-P and total P concentrations, NO3-N + NH4-N and total N concentrations, and total inorganic N (NO3-N + NH4-N) and DOC. Careful screening for outliers can substantially reduce the variability of the data. If any of the metal concentrations determined with an ICP in simultaneous mode are outliers, there is reason to check the chemistry of the whole sample.

QA Samples
QA samples should include: 1) field replicates (see sampling procedures), 2) preparation duplicates, i.e. after preparation of the routine samples for analysis (drying and sieving), duplicate subsamples are taken for chemical analysis and placed randomly within the batch, and 3) Natural audit samples, i.e. large amounts of typical soils that have been collected to be used as reference samples. These samples will be supplied by the ICP IM in cooperation with the Forest Soils Co-ordination Centre in Gent. The audit samples, randomly placed in the batch of routine samples to be analysed, can be used to evaluate within-batch precision and analytical differences among laboratories (accuracy). Usually these natural audit samples are submitted blind to the laboratory (accuracy can then be objectively assessed by the ICP IM) but it is recommended that some natural audit samples should not be blind to the laboratory and used in every batch. If the analytical results for these non-blind quality control audit samples are outside designated intervals, the batch must be reanalysed in order to bring the audit samples within tolerance specifications. This will ensure that each laboratory meets a rigid standard for each batch of samples analysed. Batch error and laboratory difference would thereby be reduced.

Besides calibration blanks (to check instrument drift), matrix spikes (to check on recovery), and analytical duplicates (subsamples of the extraction/digestion of a routine soil sample to check on within-batch precision and identifying instrumental drift), QC samples should include reagent blanks (sometimes referred to as a process blank) for those methods involving sample preparation, e.g., soil extraction. The reagent blank should be composed of all the reagents used and in the same quantities used in preparing the soil sample for analysis. The reagent blank should undergo the same digestion and extraction procedures as a routine sample and should be used to identify contamination by reagents. The ICP IM will supply extraction audit samples, and the results will be used to differentiate between systematic bias resulting from extraction and instrument sources of error. If these liquid audit samples are known to the laboratory, then the laboratory can check on these sources of error.

Verification of data

  • Blank concentrations should be less than DL
  • Relative standard deviation (RSD) of the audit pairs, field replicates, preparation duplicates, and analytical duplicates should be less than 10% (or other stated percentage) of the DL. (RSD is calculated by dividing the standard deviation of each pair by the mean, and then multiplying this value by 100)
  • Spike recovery should be within 15% (or other stated percentage) of the original spike concentration
  • Internal consistency checks (proposed standard analyte relationships):
  • sand + clay + silt = 100   
    organic soils/samples ≥ 12% organic carbon   
    pH(water) > pH(salt), with correlation ≥ 0.95   
    PCEC > ECEC, with correlation ≥ 0.80   
    ACI_ET, AL_ET correlation ≥ 0.80  
    ACI_ETB, C_TOT correlation ≥ 0.90   
    Exchangeable Ca > Mg > K > Na (units: meq/L)   
    total concentrations ≥ extractable concentrations   
    N_TOT, C_TOT correlation ≥ 0.95   
    S_TOT, N_TOT correlation ≥ 0.75 (mineral soil samples only)
  • Identification of outliers: the highest and lowest 1% of values, and values with Studentized residuals beyond ±3
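The duplicate-pair RSD and spike-recovery rules above can be sketched as follows (the 10% and 15% limits are the defaults stated in the text, with "or other stated percentage" left to the laboratory):

```python
# Sketch of the duplicate-pair RSD calculation (standard deviation of the
# pair divided by its mean, times 100) and the spike-recovery check.
import statistics

def rsd_percent(a, b):
    """Relative standard deviation of a duplicate pair, in percent."""
    return statistics.stdev([a, b]) / statistics.mean([a, b]) * 100

def spike_recovery_ok(measured, spike, limit_percent=15):
    """True if the recovered spike is within limit_percent of the added spike."""
    return abs(measured - spike) / spike * 100 <= limit_percent
```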

8.3.7 Plant materials

The total element concentrations obtained by the laboratories' standard methods need to be checked in order to determine the accuracy of these methods. Two steps of quality assurance are recommended:

Comparison of the results of the national methods with the concentrations of reference standard samples. These reference standard samples, with certified total element concentrations, are supplied e.g. by the Community Bureau of Reference (BCR) of the EC, by ISO (the International Organization for Standardization), or by the US group of foliar analysis, and will be sent to participating laboratories for analysis. The certified concentrations will be supplied to individual laboratories once a sufficient number of laboratories have submitted their results.

In order to permanently check the accuracy of the analyses, it is also recommended that each laboratory provides several of its own standard samples for analysis in each batch of samples. The data should only be accepted if the analyses of the known samples match the reference results.

8.4 Audits

Performance audits should be carried out by representatives of the technical staff from the institution operating the site once each year to see that the field operations work as intended. System audits should be carried out by the IM QA Manager in cooperation with the designated National Focal Point QA Managers at regular intervals.

A detailed check-list to be filled in during these inspections should be worked out, and the WMO GAW check-list (WMO 1994) may be used during audits of the wet deposition part of the measurements. The filled-in forms should be assessed by a scientist to ensure that all aspects of the field programme operate as intended. The auditors should bring with them copies of the filled-in forms from the last visit when performing a site inspection. Corrective action should be taken immediately when necessary.

The system audits should include:

  • Check the quality system in general   
  • Inspect the sample locations and the site surroundings, noting any changes since the last visit   
  • Follow the staff during their routines, and correct poor handling of equipment   
  • Check and calibrate the equipment and instruments   
  • Inspect the field journals   
  • Evaluate the need for improvements

An audit plan and guidelines for the audit should be worked out for this purpose.

8.5 Analytical techniques

The use of adequate methods is the responsibility of the national institutes. The majority of the participating countries have accepted the use of international standard methods, such as those prescribed by ISO/CEN, in their national work. The EN (European) standards are legally prescribed for use by all EU nations. ICP IM should also adopt ISO/CEN standard methods as the basis for the methods actually used, as has been done in the ICP Waters programme. The ISO/CEN methods are usually of high quality, well verified, and documented in a way accessible to the participants. Since changing methods is often difficult, expensive and not necessarily desirable, it should at least be documented that the methods used have a quality equal to or better than the ISO/EN standard with respect to interferences and detection levels. The main pre-treatment method and determination codes (available in the DB codelist) should be included in the data delivered to the Programme Centre.

Information on the ISO/CEN methods listed in available standards can be obtained from:

  1. The national standardisation agencies.
  2. DIN Deutsches Institut für Normung, Burggrafenstrasse 6, 10787 Berlin, Germany.
  3. ISO International Organisation for Standardisation, Case Postale 56, CH-1211 Genève, Switzerland.
  4. CEN European Committee for Standardisation, rue de Stassart 36, B-1050 Brussels, Belgium.

8.6 References and further reading

Allen, S.E. (1974) Chemical analysis of ecological materials. Blackwell Scientific, Oxford.

Cryer, R. and Trudgill, S.T. 1990. Solutes. In Goudie, A., Anderson, M., Burt, T., Lewin, J., Richards, K., Whalley, B. and Worsley, P. (eds) Geomorphological Techniques. Unwin Hyman, London, pp. 260-279.

EMEP. EMEP manual for sampling and chemical analysis, EMEP/CCC-Report 1/95, NILU, Kjeller, Norway, March 1996.

Hem, J.D. 1970. Study and interpretation of the chemical characteristics of natural water. U.S. Geological Survey Water Supply Paper N. 1473, 2nd edition.

ICP Waters. ICP Waters Programme manual. Compiled by the Programme Centre, Norwegian Institute for Water Research. Revised edition, Oslo, September 1996.

Jones, J.B. 1988. Comments on the accuracy of analytical data in the published scientific literature. Soil Science Society of America Journal 52, 1203-1204.

Kalra, Y.P. and Maynard, D.G. 1991. Methods manual for forest soil and plant analysis. Information Report NOR-X-319. Forestry Canada, Northwest Region, Northern Forestry Centre, Edmonton. 116 pp.

Lindberg, S.E., Turner, R.R., Ferguson, N.M. and Matt, D. 1977. Walker Branch watershed element cycling studies: collection and analysis of wetfall for trace elements and sulphate. In: Correll, D.L. (ed.) Watershed research in eastern North America. Volume 1. Smithsonian Institute, Edgewater, 125-150.

Reynolds, B. 1981. Methods for the collection and analysis of water samples for a geochemical cycling study. Institute of Terrestrial Ecology, Bangor, Occasional Paper No. 5.

Summers, W.K. 1972. Factors affecting the validity of chemical analyses of natural waters. Groundwater 10, 12-17.

Taylor, J.K. 1987. Quality Assurance of Chemical Measurements. Lewis Publishers, Chelsea, Michigan, 328 pp.

US-EPA 1988. Direct/Delayed Response Project: Quality Assurance Report for Physical and Chemical Analyses of Soils from the Southern Blue Ridge Province of the United States. EPA/600/PS8-86/100. September 1988.

WMO (1994) Report of the workshop on precipitation chemistry laboratory techniques. Hradec Kralove, Czech Republic, 18-21 October 1994. Edited by V. Mohnen, J. Santroch, and R. Vet. Geneva (WMO/GAW No. 102).

Published 2013-06-10 at 13:04, updated 2013-06-10 at 15:03