
Ultimate Calibration 2nd Edition

Beamex is a technology and service company that develops, manufactures and markets high-quality calibration equipment, software, systems and services for the calibration and maintenance of process instruments. The company is a leading worldwide provider of integrated calibration solutions that meet even the most demanding requirements. Beamex offers a comprehensive range of products and services, from portable calibrators to workstations, calibration accessories, calibration software, industry-specific solutions and professional services. Through Beamex’s global and competent partner network, its products and services are available in more than 60 countries. As proof of Beamex’s success, more than 10,000 companies worldwide utilize its calibration solutions. Several companies have been Beamex customers since the establishment of the company over 30 years ago. For more information about Beamex and its products and services, visit www.beamex.com.

Beamex has used reasonable efforts to ensure that this book contains both accurate and comprehensive information. Notwithstanding the foregoing, the content of this book is provided “as is” without any representations, warranties or guarantees of any kind, whether express or implied, in relation to the accuracy, completeness, adequacy, currency, quality, timeliness or fitness for a particular purpose of the content and information provided in this book. The contents of this book are for general informational purposes only. Furthermore, this book provides examples of some of the laws, regulations and standards related to calibration and is not intended to be definitive. It is the responsibility of a company to determine which laws, regulations and standards apply in specific circumstances.

Ultimate Calibration 2nd Edition Copyright © 2009–2012 by Beamex Oy Ab. All rights reserved. No part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of Beamex Oy Ab. Requests should be directed to [email protected]. Beamex is a trademark of Beamex Oy Ab. All other trademarks or trade names mentioned in this book are the property of their respective holders.

Graphic design: Studio PAP. Photos: Mats Sandström and image bank. Printed by: Fram, Vaasa, Finland, 2012.

Contents

Preface by the CEO of Beamex Group

QUALITY, REGULATIONS AND TRACEABILITY
Quality standards and industry regulations
A basic quality calibration program
Traceable and efficient calibrations in the process industry

CALIBRATION MANAGEMENT AND MAINTENANCE
Why calibrate? What is the risk of not calibrating?
Why use software for calibration management?
How often should instruments be calibrated?
How often should calibrators be calibrated?
Paperless calibration improves quality and cuts costs
Intelligent commissioning
Successfully executing a system integration project

CALIBRATION IN INDUSTRIAL APPLICATIONS
The benefits of using a documenting calibrator
Calibration of weighing instruments, Part 1
Calibration of weighing instruments, Part 2
Calibrating temperature instruments
Calculating total uncertainty of temperature calibration with a dry block
Fieldbus transmitters must also be calibrated
Configuring and calibrating smart instruments
Calibration in hazardous environments
The safest way to calibrate Fieldbus instruments

APPENDIX: Calibration terminology A to Z


Preface

Calibrators, calibration software and other related equipment have developed significantly during the past few decades, even though the calibration of measurement devices as such has existed for several thousands of years. Presently, the primary challenges of industrial metrology and calibration include how to simplify and streamline the entire calibration process, how to eliminate double work, how to reduce production downtime, and how to lower the risk of human error. All of these challenges can be tackled by improving the level of system integration and automation.

Calibration and calibrators can no longer be considered as isolated, stand-alone devices, systems or work processes within a company or production plant. Just like any other business function, calibration procedures need to be automated to a higher degree and integrated to achieve improvements in quality and efficiency. In this area, Beamex aims to be the benchmark in the industry.

This book is the 2nd edition of Ultimate Calibration. The main changes in this edition include numerous new articles and a new grouping of the articles to make it easier to find related topics. The new topics covered in this edition mainly discuss paperless calibration, intelligent commissioning, temperature calibration, and configuring and calibrating smart instruments.

This book is the result of work that has taken place between 2006 and 2012. A team of experts in industry and calibration worldwide has put effort into its creation. On behalf of Beamex, I would like to thank all of the people who have contributed to this book. I want to express my special thanks to Pamela at Beamex Marketing, who was the key person in organizing and leading the project for the 2nd edition.

I hope this book will assist you in learning new things and provide fresh, new ideas. Enjoy your reading!

Raimo Ahola, CEO, Beamex Group


Quality, Regulations and Traceability


Calibration requirements according to quality standards and industry regulations

Before going into what the current standards and regulations actually state, here is a reminder from times past about measurement practices and how important they really are.

“Immersion in water makes the straight seem bent; but reason, thus confused by false appearance, is beautifully restored by measuring, numbering and weighing; these drive vague notions of greater or less or more or heavier right out of the minds of the surveyor, the computer, and the clerk of the scales. Surely it is the better part of thought that relies on measurement and calculation.” (Plato, The Republic, 360 B.C.)

“There shall be standard measures of wine, beer, and corn… throughout the whole of our kingdom, and a standard width of dyed russet and cloth; and there shall be standard weights also.” (Clause 35, Magna Carta, 1215)

“When you can measure what you are speaking about and express it in numbers, you know something about it; but when you cannot express it in numbers, your knowledge is of a meager and unsatisfactory kind. It may be the beginning of knowledge, but you have scarcely, in your thoughts, advanced to the stage of science.” (William Thomson, 1st Baron Kelvin, GCVO, OM, PC, PRS, 26 June 1824 – 17 December 1907; a.k.a. Lord Kelvin).1

One of the earliest records of precise measurement is from Egypt. The Egyptians studied the science of geometry to assist them in the construction of the Pyramids. It is believed that about 3000 B.C., the Egyptian unit of length came into being. The “Royal Egyptian Cubit” was decreed to be equal to the length of the forearm from the bent elbow to the tip of the extended middle finger, plus the width of the palm of the hand, of the Pharaoh or King ruling at that time.2 The “Royal Cubit Master” was carved out of a block of granite to endure for all times. Workers engaged in building tombs, temples, pyramids, etc. were supplied with cubits made of wood or granite. The Royal Architect or Foreman of the construction site was responsible for maintaining and transferring the unit of length to the workers’ instruments. They were required to bring back their cubit sticks at each full moon to be compared to the Royal Cubit Master. Failure to do so was punishable by death. Though the punishment prescribed was severe, the Egyptians had anticipated the spirit of the present-day system of legal metrology, standards, traceability and calibration recall.

With this standardization and uniformity of length, the Egyptians achieved surprising accuracy. Thousands of workers were engaged in building the Great Pyramid of Giza. Through the use of cubit sticks, they achieved an accuracy of 0.05%: in roughly 756 feet, or 230.36 meters, they were within 4.5 inches, or 11.43 centimeters.
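For readers who want to verify the arithmetic, here is a quick check in Python; it is only a sketch, using the figures quoted above:

```python
# Quick check of the Great Pyramid accuracy figures quoted above.
base_length_ft = 756        # quoted base length of the Great Pyramid
accuracy_fraction = 0.0005  # 0.05 % expressed as a fraction

error_inches = base_length_ft * 12 * accuracy_fraction  # feet -> inches, then 0.05 %
print(f"{error_inches:.2f} inches")     # about 4.54 inches
print(f"{error_inches * 2.54:.2f} cm")  # about 11.52 cm
```

The result agrees with the roughly 4.5 inches (11.43 cm) stated in the text.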


The need for calibration has been around for at least 5000 years. In today’s calibration environment, there are basically two types of requirements: ISO standards and regulatory requirements. The biggest difference between the two is simple: ISO standards are voluntary, and regulatory requirements are mandatory. If an organization volunteers to meet ISO 9000 standards, it pays a company to audit it to that standard to ensure it is following its quality manual and is in compliance. On the other hand, if a company is manufacturing a drug that must meet regulatory requirements, it is inspected by government inspectors for compliance with federal regulations. In the case of ISO standards, a set of guidelines is used to write the quality manual and other standard operating procedures (SOPs), which show how the company complies with the standard. The federal regulations, however, specify in greater detail what a company must do to meet the requirements set forth in the Code of Federal Regulations (CFRs).

In Europe, detailed information for achieving regulatory compliance is provided in EudraLex, Volume 4 of The Rules Governing Medicinal Products in the European Union. The Pharmaceutical Inspection Convention and Pharmaceutical Inspection Co-operation Scheme (PIC/S) aims to improve harmonisation of Good Manufacturing Practice (GMP) standards and guidance documents.


Calibration requirements according to the U.S. Food and Drug Administration (FDA)

Following are examples of some of the regulations required by the FDA, what they say about calibration, and what must be accomplished to meet the CFRs. Please note that European standards are similar to FDA requirements. Listed below are several different parts of 21 CFR that relate to the calibration of test equipment in different situations and environments.

TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER H – MEDICAL DEVICES
PART 820 – QUALITY SYSTEM REGULATION

Subpart A – General Provisions
§ 820.1 – Scope.
§ 820.3 – Definitions.
§ 820.5 – Quality system.
Subpart B – Quality System Requirements
§ 820.20 – Management responsibility.
§ 820.22 – Quality audit.
§ 820.25 – Personnel.
Subpart C – Design Controls
§ 820.30 – Design controls.
Subpart D – Document Controls
§ 820.40 – Document controls.
Subpart E – Purchasing Controls
§ 820.50 – Purchasing controls.
Subpart F – Identification and Traceability
§ 820.60 – Identification.
§ 820.65 – Traceability.
Subpart G – Production and Process Controls
§ 820.70 – Production and process controls.
§ 820.72 – Inspection, measuring, and test equipment.
§ 820.75 – Process validation.


Subpart H – Acceptance Activities
§ 820.80 – Receiving, in-process, and finished device acceptance.
§ 820.86 – Acceptance status.
Subpart I – Nonconforming Product
§ 820.90 – Nonconforming product.
Subpart J – Corrective and Preventive Action
§ 820.100 – Corrective and preventive action.
Subpart K – Labeling and Packaging Control
§ 820.120 – Device labeling.
§ 820.130 – Device packaging.
Subpart L – Handling, Storage, Distribution, and Installation
§ 820.140 – Handling.
§ 820.150 – Storage.
§ 820.160 – Distribution.
§ 820.170 – Installation.
Subpart M – Records
§ 820.180 – General requirements.
§ 820.181 – Device master record.
§ 820.184 – Device history record.
§ 820.186 – Quality system record.
§ 820.198 – Complaint files.
Subpart N – Servicing
§ 820.200 – Servicing.
Subpart O – Statistical Techniques
§ 820.250 – Statistical techniques.

[Code of Federal Regulations] [Title 21, Volume 8] [Revised as of April 1, 2012] [CITE: 21CFR820.72]

TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER H – MEDICAL DEVICES


PART 820 – QUALITY SYSTEM REGULATION
Subpart G – Production and Process Controls
Sec. 820.72 Inspection, measuring, and test equipment.

(a) Control of inspection, measuring, and test equipment. Each manufacturer shall ensure that all inspection, measuring, and test equipment, including mechanical, automated, or electronic inspection and test equipment, is suitable for its intended purposes and is capable of producing valid results. Each manufacturer shall establish and maintain procedures to ensure that equipment is routinely calibrated, inspected, checked, and maintained. The procedures shall include provisions for handling, preservation, and storage of equipment, so that its accuracy and fitness for use are maintained. These activities shall be documented.

(b) Calibration. Calibration procedures shall include specific directions and limits for accuracy and precision. When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device’s quality. These activities shall be documented.

(1) Calibration standards. Calibration standards used for inspection, measuring, and test equipment shall be traceable to national or international standards. If national or international standards are not practical or available, the manufacturer shall use an independent reproducible standard. If no applicable standard exists, the manufacturer shall establish and maintain an in-house standard.

(2)  Calibration records. The equipment identification, calibration dates, the individual performing each calibration, and the next calibration date shall be documented. These records shall be displayed on or near each piece of equipment or shall be readily available to the personnel using such equipment and to the individuals responsible for calibrating the equipment.
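Purely as an illustration, the record content that § 820.72(b)(2) calls for could be sketched as a simple data structure; the field names below are hypothetical, not mandated by the regulation:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class CalibrationRecord:
    """Sketch of the record content listed in 21 CFR 820.72(b)(2).
    Field names are illustrative, not prescribed by the regulation."""
    equipment_id: str       # identification of the equipment
    calibration_date: date  # date this calibration was performed
    performed_by: str       # individual performing the calibration
    next_due_date: date     # next calibration date

# Example record, kept on or near the equipment or readily available:
record = CalibrationRecord("PT-0042", date(2012, 4, 1), "J. Smith", date(2013, 4, 1))
print(record)
```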


[Code of Federal Regulations] [Title 21, Volume 4] [Revised as of April 1, 2012] [CITE: 21CFR211]

TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER C – DRUGS: GENERAL
PART 211 – CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS
Subpart D – Equipment

Sec. 211.68 Automatic, mechanical, and electronic equipment.

(a) Automatic, mechanical, or electronic equipment or other types of equipment, including computers, or related systems that will perform a function satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such equipment is so used, it shall be routinely calibrated, inspected, or checked according to a written program designed to assure proper performance. Written records of those calibration checks and inspections shall be maintained.

Sec. 211.160 General requirements.

(b) Laboratory controls shall include the establishment of scientifically sound and appropriate specifications, standards, sampling plans, and test procedures designed to assure that components, drug product containers, closures, in-process materials, labeling, and drug products conform to appropriate standards of identity, strength, quality, and purity. Laboratory controls shall include:

(1) Determination of conformity to applicable written specifications for the acceptance of each lot within each shipment of components, drug product containers, closures,


and labeling used in the manufacture, processing, packing, or holding of drug products. The specifications shall include a description of the sampling and testing procedures used. Samples shall be representative and adequately identified. Such procedures shall also require appropriate retesting of any component, drug product container, or closure that is subject to deterioration.

(2) Determination of conformance to written specifications and a description of sampling and testing procedures for in-process materials. Such samples shall be representative and properly identified.



(3) Determination of conformance to written descriptions of sampling procedures and appropriate specifications for drug products. Such samples shall be representative and properly identified.



(4) The calibration of instruments, apparatus, gauges, and recording devices at suitable intervals in accordance with an established written program containing specific directions, schedules, limits for accuracy and precision, and provisions for remedial action in the event accuracy and/or precision limits are not met. Instruments, apparatus, gauges, and recording devices not meeting established specifications shall not be used.

[43 FR 45077, Sept. 29, 1978, as amended at 73 FR 51932, Sept. 8, 2008]

Sec. 211.194 Laboratory records.

(d) Complete records shall be maintained of the periodic calibration of laboratory instruments, apparatus, gauges, and recording devices required by 211.160(b)(4).
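As a minimal sketch of the kind of accuracy-limit check a written calibration program under 211.160(b)(4) might specify (the function name and values are hypothetical, not from the regulation):

```python
def within_limits(reading, nominal, tolerance):
    """True if an as-found reading is inside the written program's
    accuracy limits (nominal +/- tolerance). Illustrative only."""
    return abs(reading - nominal) <= tolerance

# Hypothetical example: a gauge checked at 100.0 units with +/-0.5 limits.
as_found = 100.7
if not within_limits(as_found, nominal=100.0, tolerance=0.5):
    # Per 211.160(b)(4): take remedial action and do not use the instrument.
    print("Out of tolerance: remove from service and take remedial action")
```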


TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER A – GENERAL
PART 11 – ELECTRONIC RECORDS; ELECTRONIC SIGNATURES
Subpart A – General Provisions

Sec. 11.1 Scope.

(a) The regulations in this part set forth the criteria under which the agency considers electronic records, electronic signatures, and handwritten signatures executed to electronic records to be trustworthy, reliable, and generally equivalent to paper records and handwritten signatures executed on paper.

(b) This part applies to records in electronic form that are created, modified, maintained, archived, retrieved, or transmitted, under any records requirements set forth in agency regulations. This part also applies to electronic records submitted to the agency under requirements of the Federal Food, Drug, and Cosmetic Act and the Public Health Service Act, even if such records are not specifically identified in agency regulations. However, this part does not apply to paper records that are, or have been, transmitted by electronic means.

(c) Where electronic signatures and their associated electronic records meet the requirements of this part, the agency will consider the electronic signatures to be equivalent to full handwritten signatures, initials, and other general signings as required by agency regulations, unless specifically excepted by regulation(s) effective on or after August 20, 1997.

(d) Electronic records that meet the requirements of this part may be used in lieu of paper records, in accordance with 11.2, unless paper records are specifically required.


(e) Computer systems (including hardware and software), controls, and attendant documentation maintained under this part shall be readily available for, and subject to, FDA inspection.

(f) This part does not apply to records required to be established or maintained by 1.326 through 1.368 of this chapter. Records that satisfy the requirements of part 1, subpart J of this chapter, but that also are required under other applicable statutory provisions or regulations, remain subject to this part.

[62 FR 13464, Mar. 20, 1997, as amended at 69 FR 71655, Dec. 9, 2004]

Sec. 11.2 Implementation.

(a) For records required to be maintained but not submitted to the agency, persons may use electronic records in lieu of paper records or electronic signatures in lieu of traditional signatures, in whole or in part, provided that the requirements of this part are met.

(b) For records submitted to the agency, persons may use electronic records in lieu of paper records or electronic signatures in lieu of traditional signatures, in whole or in part, provided that:

(1) The requirements of this part are met; and



(2) The document or parts of a document to be submitted have been identified in public docket No. 92S-0251 as being the type of submission the agency accepts in electronic form. This docket will identify specifically what types of documents or parts of documents are acceptable for submission in electronic form without paper records and the agency receiving unit(s) (e.g., specific center, office, division, branch) to which such submissions may be made. Documents to agency receiving unit(s) not specified in the public docket will not be considered as official if they are submitted in electronic form; paper forms of such documents will be considered as official and must accompany any electronic records. Persons are expected to consult with the intended agency receiving unit for details on how (e.g., method of transmission, media, file formats, and technical protocols) and whether to proceed with the electronic submission.
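As an illustration only, a trustworthy electronic record of the kind Part 11 describes pairs the record content with signing metadata. Part 11 prescribes requirements, not a data model, so the structure and names below are purely hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SignedElectronicRecord:
    """Hypothetical structure: Part 11 does not prescribe a data model."""
    content: str         # the record itself (e.g. calibration results)
    signer_id: str       # electronic signature unique to one individual
    signed_at: datetime  # when the record was signed
    meaning: str         # what the signing signifies (e.g. review, approval)

rec = SignedElectronicRecord("As-found data for PT-0042", "jsmith",
                             datetime(2012, 4, 1, 14, 30), "approval")
print(rec)
```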


TITLE 21 – FOOD AND DRUGS
CHAPTER I – FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES
SUBCHAPTER A – GENERAL
PART 11 – ELECTRONIC RECORDS; ELECTRONIC SIGNATURES
Subpart C – Electronic Signatures

Sec. 11.100 General requirements.

(a) Each electronic signature shall be unique to one individual and shall not be reused by, or reassigned to, anyone else.

(b) Before an organization establishes, assigns, certifies, or otherwise sanctions an individual’s electronic signature, or any element of such electronic signature, the organization shall verify the identity of the individual.

(c) Persons using electronic signatures shall, prior to or at the time of such use, certify to the agency that the electronic signatures in their system, used on or after August 20, 1997, are intended to be the legally binding equivalent of traditional handwritten signatures.




(1) The certification shall be submitted in paper form and signed with a traditional handwritten signature, to the Office of Regional Operations (HFC-100), 12420 Parklawn Drive, RM 3007, Rockville, MD 20857.



(2) Persons using electronic signatures shall, upon agency request, provide additional certification or testimony that a specific electronic signature is the legally binding equivalent of the signer’s handwritten signature.


Calibration requirements according to the European Medicines Agency (EMA)

Following are examples of some of the regulatory requirements of the EMA, what they say about calibration, and what must be accomplished to meet the GMPs.

EudraLex Volume 4

Chapter 3: Premises and Equipment

Equipment
3.41 Measuring, weighing, recording and control equipment should be calibrated and checked at defined intervals by appropriate methods. Adequate records of such tests should be maintained.

Chapter 4: Documentation

Manufacturing Formula and Processing Instructions
Approved, written Manufacturing Formula and Processing Instructions should exist for each product and batch size to be manufactured.

4.18 The Processing Instructions should include:
a) A statement of the processing location and the principal equipment to be used;
b) The methods, or reference to the methods, to be used for preparing the critical equipment (e.g. cleaning, assembling, calibrating, sterilising);
c) Checks that the equipment and work station are clear of previous products, documents or materials not required for the planned process, and that equipment is clean and suitable for use;
d) Detailed stepwise processing instructions [e.g. checks on materials, pre-treatments, sequence for adding materials, critical process parameters (time, temperature, etc.)];
e) The instructions for any in-process controls with their limits;
f) Where necessary, the requirements for bulk storage of the products, including the container, labeling and special storage conditions where applicable;
g) Any special precautions to be observed.


Procedures and records

Other
4.29 There should be written policies, procedures, protocols, reports and the associated records of actions taken or conclusions reached, where appropriate, for the following examples:
• Validation and qualification of processes, equipment and systems;
• Equipment assembly and calibration;
• Technology transfer;
• Maintenance, cleaning and sanitation;
• Personnel matters including signature lists, training in GMP and technical matters, clothing and hygiene and verification of the effectiveness of training;
• Environmental monitoring;
• Pest control;
• Complaints;
• Recalls;
• Returns;
• Change control;
• Investigations into deviations and non-conformances;
• Internal quality/GMP compliance audits;
• Summaries of records where appropriate (e.g. product quality review);
• Supplier audits.

4.31 Logbooks should be kept for major or critical analytical testing, production equipment, and areas where product has been processed. They should be used to record in chronological order, as appropriate, any use of the area, equipment/method, calibrations, maintenance, cleaning or repair operations, including the dates and identity of the people who carried these operations out.


Chapter 6: Quality Control

Good Quality Control Laboratory Practice

Documentation
6.7 Laboratory documentation should follow the principles given in Chapter 4. An important part of this documentation deals with Quality Control and the following details should be readily available to the Quality Control Department:
• specifications;
• sampling procedures;
• testing procedures and records (including analytical worksheets and/or laboratory notebooks);
• analytical reports and/or certificates;
• data from environmental monitoring, where required;
• validation records of test methods, where applicable;
• procedures for and records of the calibration of instruments and maintenance of equipment.

Annex 15 to the EU Guide to Good Manufacturing Practice

Title: Qualification and validation

QUALIFICATION

Installation qualification
11. Installation qualification (IQ) should be performed on new or modified facilities, systems and equipment.
12. IQ should include, but not be limited to, the following:
(a) installation of equipment, piping, services and instrumentation checked to current engineering drawings and specifications;
(b) collection and collation of supplier operating and working instructions and maintenance requirements;
(c) calibration requirements;
(d) verification of materials of construction.


Operational qualification
15. The completion of a successful operational qualification should allow the finalisation of calibration, operating and cleaning procedures, operator training and preventative maintenance requirements. It should permit a formal “release” of the facilities, systems and equipment.

Qualification of established (in-use) facilities, systems and equipment
19. Evidence should be available to support and verify the operating parameters and limits for the critical variables of the operating equipment. Additionally, the calibration, cleaning, preventative maintenance, operating procedures and operator training procedures and records should be documented.

PROCESS VALIDATION

Prospective validation
24. Prospective validation should include, but not be limited to, the following:
(a) short description of the process;
(b) summary of the critical processing steps to be investigated;
(c) list of the equipment/facilities to be used (including measuring/monitoring/recording equipment) together with its calibration status;
(d) finished product specifications for release;
(e) list of analytical methods, as appropriate;
(f) proposed in-process controls with acceptance criteria;
(g) additional testing to be carried out, with acceptance criteria and analytical validation, as appropriate;
(h) sampling plan;
(i) methods for recording and evaluating results;
(j) functions and responsibilities;
(k) proposed timetable.


EU GMP Annex 11

The EU GMP Annex 11 defines EU requirements for computerised systems, and applies to all forms of computerised systems used as part of GMP regulated activities.

Main page for the EudraLex Volume 4 Good Manufacturing Practice (GMP) Guidelines: http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm

PDF of Annex 11: http://ec.europa.eu/health/files/eudralex/vol-4/annex11_01-2011_en.pdf

EUROPEAN COMMISSION, HEALTH AND CONSUMERS DIRECTORATE-GENERAL
Public Health and Risk Assessment, Pharmaceuticals
Brussels, SANCO/C8/AM/sl/ares(2010)1064599

EudraLex – The Rules Governing Medicinal Products in the European Union
Volume 4: Good Manufacturing Practice, Medicinal Products for Human and Veterinary Use
Annex 11: Computerised Systems

Legal basis for publishing the detailed guidelines: Article 47 of Directive 2001/83/EC on the Community code relating to medicinal products for human use and Article 51 of Directive 2001/82/EC on the Community code relating to veterinary medicinal products. This document provides guidance for the interpretation of the principles


and guidelines of good manufacturing practice (GMP) for medicinal products as laid down in Directive 2003/94/EC for medicinal products for human use and Directive 91/412/EEC for veterinary use.

Status of the document: revision 1.
Reasons for changes: the Annex has been revised in response to the increased use of computerised systems and the increased complexity of these systems. Consequential amendments are also proposed for Chapter 4 of the GMP Guide.
Deadline for coming into operation: 30 June 2011.

Principle
This annex applies to all forms of computerised systems used as part of GMP regulated activities. A computerised system is a set of software and hardware components which together fulfill certain functionalities. The application should be validated; IT infrastructure should be qualified. Where a computerised system replaces a manual operation, there should be no resultant decrease in product quality, process control or quality assurance. There should be no increase in the overall risk of the process.

PIC/S
The abbreviation PIC/S describes both the Pharmaceutical Inspection Convention (PIC) and the Pharmaceutical Inspection Co-operation Scheme (PIC Scheme), which operate together. It aims to promote harmonisation of global regulations for the pharmaceutical industry. Further information can be found at the PIC/S website (http://www.picscheme.org/).


GAMP®

GAMP® is a Community of Practice (COP) of the International Society for Pharmaceutical Engineering (ISPE). The GAMP® COP aims to provide guidance and understanding concerning GxP computerized systems. COPs provide networking opportunities for people interested in similar topics. The GAMP® COP organizes discussion forums for its members, and ISPE organises GAMP®-related training courses and educational seminars.

GAMP® itself was founded in 1991 in the United Kingdom to deal with the evolving FDA expectations for Good Manufacturing Practice (GMP) compliance of manufacturing and related systems. In 1994, the organization entered into a partnership with ISPE and published its first GAMP® guidelines. Three regional Steering Committees, GAMP® Japan, GAMP® Europe, and GAMP® Americas, support the GAMP® Council, which oversees the operation of the COP and is the main link to ISPE. Several local GAMP® COPs, such as GAMP® Americas, GAMP® Nordic, GAMP® DACH (Germany, Austria, Switzerland), GAMP® Francophone, GAMP® Italiano and GAMP® Japan, produce technical content and translate ISPE technical documents. They also bring the GAMP® community closer to its members, in collaboration with ISPE’s local Affiliates in these regions.

The most well-known GAMP® publication is GAMP® 5: A Risk-Based Approach to GxP Computerized Systems. This is the latest major revision and was released in January 2008. There is also a series of related GAMP® guidance on specific topics, including:
• GAMP® Good Practice Guide: A Risk-Based Approach to Calibration Management (Second Edition)
• GAMP® Good Practice Guide: A Risk-Based Approach to GxP Compliant Laboratory Computerized Systems (Second Edition)
• GAMP® Good Practice Guide: A Risk-Based Approach to GxP Process Control Systems (Second Edition)
• GAMP® Good Practice Guide: A Risk-Based Approach to Operation of GxP Computerized Systems – A Companion Volume to GAMP® 5
• GAMP® Good Practice Guide: Electronic Data Archiving
• GAMP® Good Practice Guide: Global Information Systems Control and Compliance


• GAMP® Good Practice Guide: IT Infrastructure Control and Compliance
• GAMP® Good Practice Guide: Legacy Systems

The GAMP® Good Practice Guide: A Risk-Based Approach to Calibration Management (Second Edition) was developed by ISPE’s GAMP® COP Calibration Special Interest Group (SIG) in conjunction with representatives from the pharmaceutical industry and input from regulatory agencies. The Guide describes the principles of calibration and presents guidance on setting up a calibration management system, providing a structured approach to instrument risk assessment, calibration program management, documentation, and the corrective actions vital to regulatory compliance. The second edition of the guide has been significantly updated to address changes in regulatory expectations and in associated industry guidance documents. The scope now includes related industries, laboratory, and analytical instrumentation. A set of associated attachments is also available through the ISPE website.

ISO 9001:2008

Basically, this is what is required according to ISO 9001:2008, 7.6 CONTROL OF MONITORING AND MEASURING EQUIPMENT:
• Identify your organization’s monitoring and measuring needs and requirements (if your test instrument makes a quantitative measurement, it requires periodic calibration), and select test equipment that can meet those monitoring and measuring needs and requirements.
• Establish monitoring and measuring processes (calibration procedures and calibration record templates for recording your calibration results).
• Calibrate your monitoring and measuring equipment on a periodic schedule to ensure that results are valid (you should also perform a yearly evaluation of your calibration results to see if there is a need to increase or decrease your calibration intervals on calibrated test equipment; a sketch of such an interval evaluation follows this list). All calibrations must be traceable to a national or international standard or artifact.


• Protect your monitoring and measuring equipment (this includes during handling, preservation, storage, transportation, and shipping of all test instruments, including your customer’s items and your calibration standards).
• Confirm that monitoring and measuring software is capable of doing the job you want it to do (your software needs to be validated before being used, and, when required, your test instruments may need to be qualified prior to use).
• Evaluate the validity of previous measurements whenever you discover that your measuring or monitoring equipment is out of calibration (as stated in the FDA regulations, “When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device’s quality”; this is just as applicable when dealing with ISO as with any other standard or regulation, especially when the out-of-tolerance item is a calibration standard and may have affected numerous items of test equipment over a period of time).
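Here is a minimal sketch of the yearly interval evaluation mentioned in the list above. The decision rule (lengthen the interval after several consecutive passes, shorten it after a failure) is a common simplification assumed for this sketch, not a requirement of ISO 9001:

```python
def review_interval(current_days, history, passes_to_lengthen=3):
    """Yearly review of a calibration interval. 'history' lists recent
    results, True = found in tolerance. This simple rule of thumb is an
    assumption of the sketch, not a requirement of any standard."""
    if not history:
        return current_days                     # nothing to evaluate yet
    if not history[-1]:
        return current_days // 2                # last calibration failed: shorten
    if len(history) >= passes_to_lengthen and all(history[-passes_to_lengthen:]):
        return int(current_days * 1.5)          # consistently passing: lengthen
    return current_days                         # otherwise leave unchanged

print(review_interval(365, [True, True, True]))  # -> 547
print(review_interval(365, [True, False]))       # -> 182
```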


ISO 17025

ISO 17025 – General requirements for the competence of testing and calibration laboratories. According to ISO 17025, this standard is applicable to all organizations performing tests and/or calibrations. These include first-, second-, and third-party laboratories, and laboratories where testing and/or calibration forms part of inspection and product certification. Please keep in mind that if your calibration function and/or metrology department falls under the requirements of your company’s quality system, whether for compliance with an ISO standard (ISO 9001:2008 or ISO 13485) or an FDA requirement (cGMP, QSR, etc.), then you do not have any obligation to meet the ISO 17025 standard. You already fall under a quality system that takes care of your calibration requirements.

ANSI/NCSL Z540.3-2006

ANSI/NCSL Z540.3-2006 – American National Standard for Calibration: Requirements for the Calibration of Measuring and Test Equipment.

The objective of this National Standard is to establish the technical requirements for the calibration of measuring and test equipment. This is done through the use of a system of functional components. Collectively, these components are used to manage and assure that the accuracy and reliability of the measuring and test equipment are in accordance with identified performance requirements. In implementing its objective, this National Standard describes the technical requirements for establishing and maintaining:
• the acceptability of the performance of measuring and test equipment;
• the suitability of a calibration for its intended application;
• the compatibility of measurements with the National Measurement System; and
• the traceability of measurement results to the International System of Units (SI).

In the development of this National Standard, attention has been given to:
• expressing the technical requirements for a calibration system supporting both government and industry needs;
• applying best practices and experience with related national, international, industry, and government standards; and
• balancing the needs and interests of all stakeholders.

In addition, this National Standard includes and updates the relevant calibration system requirements for measuring and test equipment described by the previous standards, Part II of ANSI/NCSL Z540.1 (R2002) and Military Standard 45662A.

This National Standard is written for both Supplier and Customer, each term being interpreted in the broadest sense. The “Supplier” may be a producer, distributor, vendor, or a provider of a product, service, or information. The “Customer” may be a consumer, client, end-user, retailer, or purchaser that receives a product or service. Reference to this National Standard may be made by:
• customers when specifying products (including services) required;
• suppliers when specifying products offered;
• legislative or regulatory bodies;
• agencies or organizations as a contractual condition for procurement; and
• assessment organizations in the audit, certification, and other evaluations of calibration systems and their components.


This National Standard is specific to calibration systems. A calibration system operating in full compliance with this National Standard promotes confidence and facilitates management of the risks associated with measurements, tests, and calibrations.

Equipment intended for use in potentially explosive atmospheres (ATEX)

What are ATEX and IECEx?

ATEX (“ATmosphères EXplosibles”, French for explosive atmospheres) is a standard set in the European Union for explosion protection in industry. The ATEX 95 equipment directive 94/9/EC concerns equipment intended for use in potentially explosive areas. Companies in the EU where the risk of explosion is evident must also use the ATEX guidelines for protecting their employees. In addition, the ATEX rules are obligatory for electronic and electrical equipment that will be used in potentially explosive atmospheres sold in the EU as of July 1, 2003.

The IEC (International Electrotechnical Commission) is a nonprofit international standards organization that prepares and publishes international standards for electrical technologies. The IEC TC/31 technical committee deals with the standards related to equipment for explosive atmospheres. IECEx is an international scheme for certifying procedures for equipment designed for use in explosive atmospheres. The objective of the IECEx Scheme is to facilitate international trade in equipment and services for use in explosive atmospheres, while maintaining the required level of safety.

In most cases, test equipment that is required to be operated in an explosive environment would be qualified and installed by the company’s facility services department and not by the calibration personnel. One must also keep in mind that there are two different avenues for the calibration of those pieces of test equipment: on-site and off-site. If the test instrument that is used in an explosive environment must be calibrated on-site (in the explosive environment), then all the standards used for that calibration must also comply with explosive environment directives. However, if it were possible to remove the test equipment from the explosive environment when due for its periodic calibration, then there would be no requirement for the standards used for their calibration to meet the explosive


environment directives, saving money on expensive standards and possibly on the expensive training that calibration personnel would need in order to work in those conditions. Having said that, there may still be a need for the calibration personnel to be aware of the ATEX regulations.

An informative website on ATEX can be found at the following link: http://ec.europa.eu/enterprise/atex/indexinfor.htm. Several languages are available for retrieving the information. Another informative website is that of the International Electrotechnical Commission Scheme for Certification to Standards Relating to Equipment for Use in Explosive Atmospheres (IECEx Scheme): http://www.iecex.com/guides.htm.

References
1. Bucher, Jay L. 2007. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press.
2. The Story of the Egyptian Cubit. http://www.ncsli.org/misc/cubit.cfm (18 October, 2008).
3. 21 CFR Parts 211.68 and 211.160. http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=211/ (5 July, 2012).
4. 21 CFR Part 11. http://www.fda.gov/downloads/RegulatoryInformation/Guidances/ucm125125.pdf and http://www.fda.gov/RegulatoryInformation/Guidances/ucm125067.htm (5 July, 2012).
5. GAMP. http://en.wikipedia.org/wiki/Good_Automated_Manufacturing_Practice (5 July, 2012).
6. NCSL International. 2006. ANSI/NCSL Z540.3-2006. Boulder, CO.



A basic quality calibration program

R&D departments are tasked with coming up with the answers to many problems; the cure for cancer is one of them. Let’s imagine that the Acme Biotech Co. has found the cure for cancer. Its R&D section sends the formula to the operations and manufacturing division, but the cure cannot be replicated with consistent results, because the company is not using calibrated test instruments. Measurements made by R&D are different from those made by the operations section. If all test equipment were calibrated to a traceable standard, repeatable results would ensure that what is made in one part of the company can also be reproduced in another part of the company. The company loses time, money, its reputation, and possibly the ability to stay in business simply because it does not use calibrated test equipment.

A fairy tale? Hardly. This scenario is repeated every day throughout the world. Without calibration, or with incorrect calibrations, all of us pay more at the gas pump, for food weighed incorrectly at the checkout counter, and for manufactured goods that do not meet their stated specifications. Incorrect amounts of ingredients in prescription and over-the-counter (OTC) drugs can cost more, or even cause illness or death. Because of poor or incorrect calibration, criminals are either not convicted or are released on bad evidence. Crime labs cannot identify the remains of victims, or wrongly identify victims in the case of mass graves. Airliners fly into mountaintops and off the ends of runways because they don’t know their altitude and/or speed. Babies are not correctly weighed at birth. The amount of drugs confiscated in a raid determines whether the offense is a misdemeanor or a felony; which weight is correct? As one can see, having correct measurements throughout any and all industries is critical to national and international trade and commerce.


The bottom line is this: all test equipment that makes a quantitative measurement requires periodic calibration. It is as simple as that. However, before we go any further, we need to clarify two definitions that are critical to this subject, calibration and traceability. By definition:

Calibration is a comparison of two measurement devices or systems, one of known uncertainty (your standard) and one of unknown uncertainty (your test equipment or instrument).

Traceability is the property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of calibrations all having stated uncertainties.

The calibration of any piece of equipment or system is simply a comparison between the standard being used (with its known uncertainty) and the unit under test (UUT) or test instrument that is being calibrated (whose uncertainty is unknown; that is why it is being calibrated). It does not make any difference whether you adjust, align or repair the item, or whether you cannot adjust or align it. The comparison to a more accurate standard, no matter the circumstances, is called calibration. Many people are under the misconception that an item must be adjusted or aligned in order to be calibrated. Nothing could be further from the truth.

Before we can get any deeper into what traceability is, we should explain two different traceability pyramids. When we talk about traceability to a national or international standard, the ‘everyday calibration technician’ is usually situated close to the bottom of the pyramid, so a graphic illustration of these pyramids is important. The two examples in figures 1 and 2 are similar, but differ depending on where you are in the chain, or in certain parts of the world.

There are basically two ways to maintain traceability during calibration: the use of an uncertainty budget (performing uncertainty calculations for each measurement), and the use of a test uncertainty ratio (TUR) of ≥ 4:1.

First, let’s discuss the use of uncertainty budgets. According to the European co-operation for Accreditation of Laboratories publication EAL-G12, Traceability of Measuring and Test Equipment to National Standards, the purpose of which is to give guidance on the calibration and maintenance of measuring


Figure 1. Traceability pyramid: BIPM → NMIs → Reference standards → Working metrology labs → General-purpose calibration labs (inside a company) → User’s test equipment.

Figure 2. Traceability pyramid: SI units → Primary standards → Secondary standards → Reference standards → Working standards → User’s test equipment.

Note: NMI = National Metrology Institute.

equipment in meeting the requirements of the ISO 9000 series of standards for quality systems and the EN 45001 standard for the operation of testing laboratories, paragraphs 4 and 5 are very specific in their requirements:

4 Why are calibrations and traceability necessary?

4.1 Traceability of measuring and test equipment to national standards by means of calibration is necessitated by the growing national and international demand that manufactured parts be interchangeable; supplier firms that make products, and customers who install them with other parts, must measure with the ‘same measure’.


4.2 There are legal as well as technical reasons for traceability of measurement. Relevant laws and regulations have to be complied with just as much as the contractual provisions agreed with the purchaser of the product (guarantee of product quality) and the obligation to put into circulation only products whose safety, if they are used properly, is not affected by defects.
Note: If binding requirements for the accuracy of measuring and test equipment have been stipulated, failure to meet these requirements means the absence of a warranted quality with considerable consequent liability.

4.3 If it becomes necessary to prove absence of liability, the producer must be able to demonstrate, by reference to a systematic and fully documented system, that adequate measuring and test equipment was chosen, was in proper working order and was used correctly for controlling a product.

4.4 There are similar technical and legal reasons why calibration and testing laboratory operators should have consistent control of measuring and test equipment in the manner described.

5 Elements of traceability

5.1 Traceability is characterised by a number of essential elements:
(a) an unbroken chain of comparisons going back to a standard acceptable to the parties, usually a national or international standard;
(b) measurement uncertainty; the measurement uncertainty for each step in the traceability chain must be calculated according to agreed methods and must be stated so that an overall uncertainty for the whole chain may be calculated;
(c) documentation; each step in the chain must be performed according to documented and generally acknowledged procedures; the results must equally be documented;
(d) competence; the laboratories or bodies performing one or more steps in the chain must supply evidence for their technical competence, e.g. by demonstrating that they are accredited;
(e) reference to SI units; the chain of comparisons must end at primary standards for the realization of the SI units;
(f) re-calibrations; calibrations must be repeated at appropriate intervals; the length of these intervals will depend on a number of variables, e.g. uncertainty required, frequency of use, way of use, stability of the equipment.


5.2 In many fields, reference materials take the position of physical reference standards. It is equally important that such reference materials are traceable to relevant SI units. Certification of reference materials is a method that is often used to demonstrate traceability to SI units.1

The other document that goes hand-in-hand with this is EA-4/02, Expression of the Uncertainty of Measurement in Calibration. The purpose of this document is to harmonise the evaluation of uncertainty of measurement within EA, to set up, in addition to the general requirements of EAL-R1, the specific demands in reporting uncertainty of measurement on calibration certificates issued by accredited laboratories, and to assist accreditation bodies with a coherent assignment of best measurement capability to calibration laboratories accredited by them. As the rules laid down in this document are in compliance with the recommendations of the Guide to the Expression of Uncertainty in Measurement, published by seven international organisations concerned with standardisation and metrology, the implementation of EA-4/02 will also foster the global acceptance of European results of measurement.2

By understanding and following both of these documents, a calibration function can easily maintain traceable calibrations that meet the requirements demanded by its customers and the standard or regulation that the company needs to meet.

To maintain traceability without using uncertainty budgets or calculations, you must ensure your standards are at least four times (4:1) more accurate than the test equipment being calibrated. Where does this ratio of four to one (4:1) come from? It comes from the American National Standard for Calibration (ANSI/NCSL Z540.3-2006), which states: “Where calibrations provide for verification that measurement quantities are within specified tolerances… Where it is not practical to estimate this probability, the TUR shall be equal to or greater than 4:1.” So, if a TUR of equal to or greater than 4:1 is maintained, then traceability is assured. Keep in mind that a TUR of 4:1 somewhere along the chain of calibrations may not have been feasible, in which case uncertainty calculations were performed and the uncertainty stated on the certificate of calibration. This is correct and acceptable.
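The two approaches can be illustrated with a short sketch; the contribution values and the k = 2 coverage factor are assumptions made for the example, not values taken from any standard:

```python
import math

# Approach 1: an uncertainty budget. Combine the standard's uncertainty
# contributions by root-sum-of-squares and expand with coverage factor k = 2.
contributions = [0.010, 0.004, 0.003]   # standard uncertainties (example units)
combined = math.sqrt(sum(u ** 2 for u in contributions))
expanded = 2 * combined                 # expanded uncertainty, k = 2
print(f"Expanded uncertainty: {expanded:.4f}")   # -> 0.0224

# Approach 2: a test uncertainty ratio (TUR) checked against the 4:1 rule.
uut_tolerance = 0.10                    # tolerance of the unit under test
tur = uut_tolerance / expanded
print(f"TUR = {tur:.1f}:1, 4:1 rule satisfied: {tur >= 4}")  # -> 4.5:1, True
```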


In most circumstances, the need to maintain a TUR of 4:1 comes into play at the company or shop level, where the customer’s test equipment is usually used for production or manufacturing purposes only.

So how do calibration and traceability fit into the big picture? What does the big picture look like? Why do you need a quality calibration program? You need to establish a quality calibration program to ensure that all operations throughout the metrology department occur in a stable manner. The effective operation of such a system will hopefully result in stable processes and, therefore, in a consistent output from those processes. Once stability and consistency are achieved, it is possible to initiate process improvements. This is applicable in every phase of a production and/or manufacturing program, but it is especially true in a metrology department.3

Let’s take for example a calibration program that has six calibration technicians on staff. Four of them work in another facility calibrating the same types of equipment as the other two. However, the other two have far more experience and, through no fault of their own, do not use the calibration procedures that are required by their quality system. They have calibrated the same items for several years and feel there is nothing new to learn. One of the four calibration technicians (who always follow the calibration procedures) finds there is a faster, more economical way to perform a specific calibration. They submit a change proposal for the calibration procedure, and everyone is briefed and trained on the new technique. The four calibration technicians who have been following the calibration procedure improve their production and save the company money. The two ‘old timers’ see a reduction in their production and actually cost the company money. If everyone had been using the calibration procedures as they were supposed to, this would not have happened. Process improvements cannot take place across the department if everyone is not doing the job the same way each and every time they perform a calibration.

We are not naïve enough to believe that calibration technicians who have performed a particular calibration hundreds or even thousands of times are going to follow the calibration procedure word for word. Of course not. But they must have their calibration procedure on hand each time they are performing the calibration. If a change has been made to that procedure, the calibration technician must be trained on the change before they can perform the calibration, and the appropriate documentation must be completed to show that training


was accomplished and signed off. When the proper training is not documented and signed off by the trainer and trainee, it is the same as if the training never happened.

What is a quality calibration program?

A quality calibration program consists of several broad items referred to in the Quality System Regulation (QSR) from the Food and Drug Administration (FDA). These items are also referred to by other standards (ISO 9000, etc.) and regulations throughout most industries that regulate or monitor production and manufacturing of all types of products. One of the most stringent requirements can be found in the current Good Manufacturing Practices (GMP).

The basic premise and foundation of a quality calibration program is to “Say what you do, Do what you say, Record what you did, Check the results, and Act on the difference”. Let’s break these down into simple terms.

“Say what you do” means write in detail how to do your job. This includes calibration procedures, work instructions and standard operating procedures (SOPs).

“Do what you say” means follow the documented procedures or instructions every time you calibrate, or perform a function that follows specific written instructions.

“Record what you did” means that you must record the results of your measurements and adjustments, including what your standard(s) read or indicated both before and after any adjustments might be made.

“Check the results” means make certain the test equipment meets the tolerances, accuracies, or upper/lower limits specified in your procedures or instructions.

“Act on the difference” means if the test equipment is out of tolerance, you are required to inform the user/owner of the equipment, because they may have to re-evaluate manufactured goods, change a process, or recall a product.3

“Say what you do” means write in detail how to do your job. This includes calibration procedures, work instructions and SOPs. All of your calibration procedures should be formatted the same as other SOPs within your company. Here is an example of common formatting for SOPs:
1. Purpose
2. Scope
3. Responsibilities
4. Definitions
5. Procedure
6. Related Procedures
7. Forms and Records
8. Document History
After section 4, Definitions, you should have a table listing all of the instruments or systems to be calibrated by that procedure, along with their ranges and tolerances. After that you should have a list of the standards to be used to calibrate the items; this table should also include each standard's range and specifications. The actual calibration procedure then starts in section 5, Procedure. Manufacturers' manuals usually provide an alignment procedure that can be used as a template for writing a calibration procedure. They should show which standards accomplish the calibration of a specific range and/or function. A complete calibration must be performed prior to any adjustment or alignment. An alignment procedure and/or preventive maintenance inspection (PMI) may be incorporated into your SOP as long as it is separate from the actual calibration procedure.
There are, generally speaking, two types of calibration procedures:
• Generic: temperature gages and thermometers, pressure and vacuum gages, pipettes, micrometers, power supplies and water baths.
• Specific: spectrophotometers, thermal cyclers, and balances/scales.
Generic SOPs are written to show how to calibrate a large variety of items in a general context. Specific SOPs are written to show step-by-step procedures for each different type of test instrument within a group of items. Ideally, the calibration form is designed to follow the procedure's steps, number by number, removing any doubt for the calibration technician about which data goes into which data field.
"Do what you say" means follow the documented procedures or instructions every time you calibrate, or perform a function that follows specific written instructions. This means following published calibration procedures every time you calibrate a piece of test equipment. Have the latest edition of the procedure available for use by your calibration technicians. Have a system in place for updating your
procedures. Train your technicians on the changes made to your procedures every time a procedure is changed or improved – and document the training.
What do you do when you need to make an improvement, or update your calibration procedures and/or forms? A formal, written process must be in place, covering:
•  Who can make changes
•  Who is the final approval authority
•  A revision tracking system
•  A process for validating the changes
•  An archiving system for old procedures
•  Instructions for posting new and removing old procedures
•  A system for training on revisions
•  A place to document that training was done
"Record what you did" means that you must record the results of your measurements and adjustments, including what your standard(s) read or indicated both before and after any adjustments are made, and keep your calibration records in a secure location. Certain requirements must be documented in each calibration record. There are, of course, many ways to accomplish this, including:
•  pen and paper
•  'do-it-yourself' databases, e.g. Excel, Access
•  the calibration module of a computerized maintenance management system (CMMS)
•  calibration software specifically designed for that purpose
The required record contents include the identification of the test instrument with a unique identification number, its part number and range/tolerance. The location where the test instrument can be found should also be on the record. A history of each calibration and a traceability statement or uncertainty budget must be included. The date of calibration, the last time the instrument was calibrated, and the next time it will be due for calibration should be on the form. There should be a place to show what the standard read, as well as the test instrument's 'As Found' and, when applicable, 'As Left' readings.
The 'As Found' readings are what the test instrument read the first time the calibration is performed, prior to alignment, adjustment or repair. The entire calibration is performed to see if any part of the calibration is out of tolerance. If an out-of-tolerance (OOT) condition is found,
record the reading (on both the standard and the UUT) and continue with the rest of the calibration to the end of the calibration procedure. If one were to stop at the point where an OOT is found, make an adjustment, then proceed with the calibration, there is a good possibility that the adjustment affected other ranges or parts of the calibration. This is why the entire calibration is performed prior to adjustment or alignment.
There will be times when an instrument has a catastrophic failure. It simply dies and cannot be calibrated. This should be noted in the calibration record. Then, once the problem is found and repaired, an 'As Found' calibration is performed. The UUT is treated the same as any OOT unit, except that you would not have been able to collect the original 'As Found' readings.
'As Left' readings are taken after repair, alignment, or adjustment. Not all UUTs whose 'As Left' readings are taken would be considered OOT. In some circumstances, it might be metrology department policy to adjust an item that has drifted more than halfway through its tolerance band while still meeting its specifications. In this type of situation, after the UUT is adjusted to be as close to optimum as possible, a complete calibration is again performed, collecting the 'As Left' readings for the final calibration record. Another example would be when a preventive maintenance inspection is going to be performed on an item. The calibration is performed, collecting the 'As Found' data. Then the PMI is completed, and an 'As Left' set of data is collected. If the item is found to be out of tolerance at that point, there is no problem, since it was found to be in tolerance during the first calibration. It would be obvious that something happened during the cleaning, alignment or adjustment, and after a final adjustment is completed to bring the unit back into tolerance, a final 'As Left' calibration is performed.
The standard reading, from the working or reference standard you are using to calibrate the UUT, will also be recorded on the calibration form. Usually, the standard is set at a predetermined output, and the UUT is read to see how much it deviates from the standard. This best practice has been in use in the metrology community since calibration began. However, there will be times when this is not possible. One example where it is not practical to set the standard and take a reading is the calibration of water baths. The water bath is set to a predetermined temperature, and the temperature standard is used to record the actual reading. Compare this to the calibration of pressure gages, where a pressure standard is set to a standard pressure,
and the gage(s) under test are then read, their pressures recorded on the calibration record and compared to the standard to see if they are in or out of tolerance. In other cases, such as the calibration of autoclaves, the unit is set to complete a sterilization cycle, a temperature device records all of the temperature readings throughout the cycle, and the readings are checked to see if the autoclave met its specifications. The same happens when calibrating thermometers: they, along with the standard, are placed in a dry block and a particular temperature is set. The UUT is compared to the reference after equilibration, and a determination is made as to whether the UUT is in or out of tolerance. As can be seen from the above examples, it is not always possible to set the standard and take a reading from the UUT.
Also on the calibration form should be an area to identify the standard(s) that were used, along with their next calibration due date(s), plus their specifications and range. There should also be a place to identify which calibration procedure was used, along with the procedure's revision number. There must be a statement showing traceability to your NMI (or, in the case of most companies in the USA, to NIST), or to any artifact that was used as a standard. You should include any uncertainty budgets if used, or at least a statement that a TUR of ≥ 4:1 was met.
List environmental conditions when appropriate and show whether they pass or fail. According to NCSL International Calibration Control Systems for the Biomedical and Pharmaceutical Industry – Recommended Practice RP-6, paragraph 5.11: "The calibration environment need be controlled only to the extent required by the most environmentally sensitive measurement performed in the area."4 According to ANSI/NCSL Z540.3-2006, paragraph 5.3.6 Influence factors and conditions: "All factors and conditions of the calibration area that adversely influence the calibration results shall be defined, monitored, recorded, and mitigated to meet calibration process requirements. Note: Influencing factors and conditions may include temperature, humidity, vibration, electromagnetic interference, dust, etc. Calibration shall be stopped when the adverse effects of the influence factors and conditions jeopardize the results of the calibration."5
If the conditions in the area where calibrations are performed require monitoring according to the standard or requirements that must be met, then a formal program must be in place for tracking those conditions and reviewing the data. If this is the case, then there should be a place on the calibration form for showing that those conditions were
either met, were not met, or are not applicable to that calibration.
You should indicate on the form whether the calibration passed or failed. If the UUT had an out-of-tolerance condition, then there should be a place to show what happened to the UUT, with the following possibilities as examples:
• The user/customer was notified and the UUT was adjusted and meets specifications.
• The user/customer was notified and the UUT was given a 'limited calibration' with their written approval.
• The user/customer was notified and the UUT was taken out of service and tagged as unusable.
Notice that in each circumstance the user/customer must be notified of any and all OOTs. This is called for in all of the standards and regulations. The user/customer, even if internal to the company performing the calibrations, must be informed if their test equipment does not meet its specifications.
There should be an area set aside in the calibration form for making comments or remarks. Enough space should be available for the calibration technician to include information about the calibration, OOT conditions, what was accomplished if an OOT was found, etc. And finally, the calibration record must be signed and dated by the technician performing the calibration. In some instances, the calibration record requires a 'second set of eyes'. This means that an individual higher up the chain of command (supervisor, manager, QA inspector, etc.) must review the calibration record and also sign and date that it has been reviewed, audited, or inspected before it is considered a completed record. If this is the case, there should be a place on the form for the final reviewer to sign and date.
What do you do if, after recording your results, you find that you have made an error, or transposed numbers, and want to correct it? For hard copy records, draw a single line through the entry, write the correct data, and then place your initials and the date next to the data, using black ink. Do not use white-out, and do not erase the original data. For corrections to electronic records (eRecords), use whatever tracking system the software provides, or make a duplicate record from scratch with the correct data, explain in the comments block what happened, and date and sign accordingly.
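Taken together, the record contents described above amount to a well-defined data structure. The following is a minimal sketch in Python; the class and field names are illustrative assumptions, not taken from any particular calibration software.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CalibrationRecord:
    """Illustrative set of fields a calibration record might carry."""
    instrument_id: str              # unique ID of the unit under test
    part_number: str
    location: str                   # where the instrument can be found
    range_low: float
    range_high: float
    tolerance: float                # allowed deviation, engineering units
    procedure_id: str               # SOP used (hypothetical ID scheme)
    procedure_revision: str
    standards_used: list[str]       # IDs of the standards used
    calibration_date: date
    due_date: date
    as_found: list[tuple[float, float]]   # (standard, UUT) reading pairs
    as_left: Optional[list[tuple[float, float]]] = None  # only if adjusted
    environment_ok: Optional[bool] = None  # None = not applicable
    passed: bool = False
    comments: str = ""
    technician: str = ""            # signature of the technician
    reviewer: str = ""              # 'second set of eyes', where required
```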

There should be only one way to file your records, both hard copy and eRecords – and no matter which system you use, put it into your written procedures.
An example for filing hard copy records:
•  Each record is filed by its unique ID number
•  Records are filed with the newest in the front
•  Records are filed within a specified time frame
An example for filing eRecords:
•  Filed by ID number, calibration certificate number and calibration date
•  Placed on a secure drive that has regular backups
•  eRecords are filed within a specified time frame
There are many different ways to manage your calibration data, since there are a variety of ways to collect that data. Hard copy records collected during the calibration of test instruments have already been discussed in detail. But the collection of data by electronic means, or through the use of calibration software, process controllers, etc., should also be considered. Is the system validated and the instrumentation qualified prior to use? If you are using any type of computerized system, validation of that software is mandatory. How is the data collected and stored? Is it kept in its native format or dumped into a spreadsheet for analysis? All of these questions need to be considered to allow for review, analysis, and/or compilation into your forms, and eventual storage. The use of computerized data collection brings with it not only increased productivity and savings in time and effort, but also new problems in how to collect, manage, review and store the data. The criticality of validating your software, data lines and storage systems when going entirely electronic with your calibration records and data management cannot be emphasized enough.
"Check the results" means make certain the test equipment meets the tolerances, accuracies, or upper/lower limits specified in your procedures or instructions. There are various ways to do this. Calibration forms should list the range and tolerances for each piece of test equipment being calibrated. In some instances it is apparent what the tolerances will be for the items being calibrated. In other cases it is not quite so apparent.
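At its core, "Check the results" is a limit comparison. A minimal sketch, assuming a symmetric tolerance expressed in the same engineering units as the readings:

```python
def within_tolerance(uut_reading: float, standard_reading: float,
                     tolerance: float) -> bool:
    """Return True if the UUT deviates from the standard by no more
    than the allowed tolerance (symmetric, in engineering units)."""
    return abs(uut_reading - standard_reading) <= tolerance

# Example: a gage reads 100.6 psi against a 100.0 psi standard,
# with a tolerance of +/- 0.5 psi -> out of tolerance.
print(within_tolerance(100.6, 100.0, 0.5))  # False
```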

"Act on the difference" means that if the test equipment is out of tolerance, you must inform the user, because they may have to re-evaluate manufactured goods, change a process or procedure, or recall product. According to the FDA: "When accuracy and precision limits are not met, there shall be provisions for remedial action to reestablish the limits and to evaluate whether there was any adverse effect on the device's quality."
You should have a written procedure in place that explains in detail:
• What actions are to be taken by the calibration technician
• What actions are to be taken by the department supervisor and/or manager
• What actions are to be taken by the responsible owner/user of the OOT test equipment
You should have an SOP that explains the responsibilities of the calibration technician:
• Do they have additional form(s) to complete when OOT conditions are found?
• Do they require a 'second set of eyes' when/if an OOT is found?
• Have they been trained, and signed off, so that they know all the proper procedures when an OOT has been found?
You should have an SOP that explains the responsibilities of the supervisor/manager:
• Who notifies the customer – the technician, supervisor or manager?
• Is a database maintained on all OOT test equipment?
• Is the customer/user required to reply to the OOT notification? If so, is there a time limit, and a paper trail for historical reference?
After owner/user notification, is the calibration department responsible for anything else?
• Is the final action by the owner/user sent back for filing or archiving? Usually the department that generates an action item is responsible for final archiving.
• Are there any databases that need to be updated, or upper management to be notified in case of inaction?

Do you have a database of all OOT test equipment for various activities?
• The database can be used for the yearly calibration interval analysis.
• Access to OOT data can assist in determining the reliability of test equipment.
• During an audit/inspection (both internal and external), past OOT data should be easily accessible.
Here is a hypothetical example: from a historical perspective, generally 85% of test equipment passes calibration.
• Among the 15% that are found to be OOT, some will be due to operator error, bad standards, bad cables/accessories, poorly written calibration procedures, or environmental conditions (vibration, etc.).
• If a higher fail rate is noticed, check that the proper specifications are being used before changing calibration intervals.

[Figure: Typical calibration process as shown in a flow chart – start; run the 'As Found' test and save the results; if adjustment is required, adjust as needed until within limits; run the 'As Left' test and save the results; end.]
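In code form, the flow sketched in the chart might look like the following; the read/adjust callables are placeholders for real instrument I/O, not part of any actual calibration tool.

```python
def calibrate(test_points, read_standard, read_uut, adjust_uut, tolerance):
    """Sketch of the flow chart: a full 'As Found' test first, adjustment
    only if required, then a full 'As Left' test."""
    def full_test():
        # The complete set of points is always run; never adjust mid-test.
        return [(read_standard(p), read_uut(p)) for p in test_points]

    as_found = full_test()
    print("As Found:", as_found)                       # save 'As Found' results
    if any(abs(uut - std) > tolerance for std, uut in as_found):
        adjust_uut()                                   # adjust as needed
        as_left = full_test()                          # repeat the entire test
        print("As Left:", as_left)                     # save 'As Left' results
```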

Developing a world-class calibration program
A quality calibration program might be compared to an iceberg: only about 10% can be easily seen by the casual observer, yet the unseen portion is what keeps the iceberg afloat and stable in the ocean. The same can be said of a quality calibration program. The "Say what you do, Do what you say, Record what you did, Check the results, and Act on the difference" portion, along with traceability, should be apparent to an auditor or inspector. But the parts that keep a quality calibration program running efficiently consist of elements from a continuous process improvement program, scheduling and calibration management software, an effective training program, a comprehensive calibration interval analysis program, correct and properly used calibration and equipment labels, and a visible safety program. Without any one of these, a quality calibration program would be impossible to maintain.
Having an effective calibration management program is usually the difference between being proactive and reactive in performing your routine calibrations. By knowing what is coming due for calibration, you can schedule your technicians, standards, time and other resources to the best advantage. Compare this to the person who is trying to drain the swamp while fighting off the alligators: it is hard to keep your overdue calibrations at a minimum when all of your time is spent reacting to items that keep coming due without your prior knowledge. Any calibration management program worth the money should have a few critical capabilities built in: a master inventory list, reverse traceability, the ability to see a 30-day schedule of items coming due for calibration, and the ability to see all items that are currently overdue for calibration. From a managerial standpoint, the calibration management program should also be able to show calibrations and repairs by individual item, by groups of items by location or part number, items that are OOT, and other listings that help you manage your department. According to most standards and regulations, any software program used must be validated prior to implementation. This can be accomplished using the manufacturer's
system, or by incorporating an in-house validation system. Either way, your validation paperwork needs to be available during audits and inspections.
A best practice among experienced calibration practitioners is to calibrate like items together, and to use your scheduling software to group calibrations by geographical or local area. An example of this would be to calibrate all pressure gages that are stored or used in a specific area or on a specific floor of a building, which uses your time to the best advantage. Also, if calibrations are to be performed in a clean-room environment, and the calibration technician is required to gown up prior to every entry, then scheduling all of the calibrations in that area together can increase production and reduce the downtime caused by multiple entries and exits. Combining the calibration of like items, and mixing and matching items, can also reduce the monotony of repetitive calibrations. An example would be to start all temperature calibrations (set the water baths up for their initial temperature readings), then perform several pipette or balance calibrations, return to set another temperature in the water baths (doing a few at a time), return to finish the pipette or balance calibrations, and then complete the water baths at their final setting. By not having to stand around waiting for the water baths to equilibrate, you use your time more efficiently, increase productivity, and keep the calibration technician involved and focused instead of bored.
Another critical, yet oftentimes misunderstood, program is calibration interval analysis. How often should each type of test equipment be calibrated? Should the manufacturer's recommended interval be the determining factor, or should the criticality of how the test equipment is used in your particular production or manufacturing line have the deciding vote? Your specific situation should be the driving factor in deciding calibration intervals. Most manufacturers recommend a 12-month calibration interval, depending on usage, environment, handling, etc. A particular item used in a controlled environment should be more reliable than one used in a harsher situation, say outdoors in severe weather. You must also consider whether the test equipment is used to determine final product quality, where specifications are very tight, or is an item coded as 'No Calibration Required' on a loading dock. Each situation should be considered carefully so that it can be reviewed in the appropriate light.
Calibration interval analysis software can be purchased commercially
and used to evaluate your test equipment. NCSL International also publishes RP-1, Establishment & Adjustment of Calibration Intervals. This Recommended Practice (RP) is intended to provide a guide for the establishment and adjustment of calibration intervals for equipment subject to periodic calibration. It provides the information needed to design, implement and manage calibration interval determination, adjustment and evaluation programs, presenting both management and technical information. Several methods of calibration interval analysis and adjustment are presented; the advantages and disadvantages of each method are described, and guidelines are given to assist in selecting the best method for a requiring organization.
A company could also do its own analysis if it supports a limited number of items, or is on a tight budget and willing to do its own computations. Here is an example (a minimal sketch of the computation follows this list):
• For each type of equipment, collect data over a one-year period on the number of calibrations and the number of items found OOT
• Subtract the number of OOTs from the number of calibrations, divide the result by the number of calibrations, then multiply by 100 for the pass rate
• Make a risk assessment of each item for your company's needs; set a cut-off for increasing or decreasing calibration intervals
• Consider increasing a calibration interval if the pass rate is ≥ 95% (by ½ up to double the current calibration interval)
• Consider decreasing a calibration interval if the pass rate is ≤ 85% (to ¾ down to ½ of the current calibration interval)
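The sketch below implements that rule of thumb; the 95%/85% thresholds and the adjustment factors are the ones suggested above, and the conservative ends of the suggested ranges are used.

```python
def suggest_interval(calibrations: int, oot_count: int,
                     current_interval_months: float) -> tuple[float, float]:
    """Return (pass_rate_percent, suggested_interval_months); always
    temper the suggestion with a risk assessment of the item."""
    pass_rate = (calibrations - oot_count) / calibrations * 100
    if pass_rate >= 95:
        # Consider lengthening by 1.5x, up to 2x; take the cautious end.
        suggested = current_interval_months * 1.5
    elif pass_rate <= 85:
        # Consider shortening to between 3/4 and 1/2 of the interval.
        suggested = current_interval_months * 0.75
    else:
        suggested = current_interval_months
    return pass_rate, suggested

# Example: 40 calibrations, 1 OOT, 12-month interval
# -> 97.5 % pass rate, consider an 18-month interval.
print(suggest_interval(40, 1, 12.0))
```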

No matter which route you take for calibration interval analysis, ensure you are on the cutting edge – not on the ragged edge from extending your intervals too fast without solid data. Recalls can be very expensive in time, in money, and to your company's reputation!

The cost and risk of not calibrating
Are there costs and/or risks associated with not calibrating your test equipment? This is a double-edged sword. On one side we have the requirements of the standards and regulations that govern various companies, industries and even countries. Calibration is not only a requirement, but one of the foundations of any quality system in the 21st century. It isn't a question of whether you have a quality calibration program in place, but whether it complies with all the requirements of the appropriate standard or regulation to which your company must conform.
The other side of the double-edged sword is having a calibration program in place without any kind of quality, traceability or documentation. This would equate to not having any calibration program at all. If a manufacturer produces any type of product or service where repeatable measurements take place, then its test equipment and instruments need to have repeatable outputs. Without calibration to a traceable standard (national, international, or intrinsic), there can be no repeatability. Therefore there can be no quality in the product, and the company would never be able to stay in business long enough to make an impact on its market segment.
So is there cost and risk? Absolutely. The cost is huge in terms of lost production, time, money, and reputation. In the case of companies that have untraceable calibration in the production of medical devices, pharmaceutical drugs and products that impact human safety, the cost could be immeasurable… with the possibility of death among the results.
The basic belief is this: it is absolutely essential to have a quality calibration program in place to make a quality product, no matter the size, shape, or quantity. The question that should be asked is: "Do you have a quality calibration program with results traceable to a national or international standard?" If the answer is yes, then it is assumed that you also have all the parts needed to support traceable calibration: calibration procedures, calibration records, traceable documentation, an out-of-tolerance program and procedures, document control procedures, a training program, continuous process improvements, a comprehensive calibration management software package, calibration interval analysis, documented training for all your calibration technicians, and the ability to provide quality customer service in a timely manner. Then you can say you have a quality calibration program.
But it doesn't end there. Returning to the double-edged sword, what are the responsibilities of a quality calibration department, and what are those of its customers?
A calibration/metrology department should be responsible for:
• Listening to their customers to understand their requirements and needs
• Translating those requirements to the accuracy and specifications of the test equipment and support services that meet or exceed their quality expectations
• Delivering test equipment that consistently meets requirements for reliable performance
• Providing knowledgeable and comprehensive test equipment support
• Continuously reviewing and improving their services and processes
Your customers should be responsible for:
• Informing Metrology of their requirements and needs
• Getting the proper training in the correct and safe usage of test equipment
• Maintaining their test equipment without abusing, contaminating or damaging it under normal operating conditions
• Using their work order system for requesting service when equipment is broken, malfunctioning, or in need of calibration
As Lord Kelvin was quoted as saying, "If you cannot measure it, you cannot improve it."

1. EAL-G12, Traceability of Measurement. Edition 1, November 1995.
2. EA-4/02, Expression of the Uncertainty of Measurement in Calibration. December 1999, rev 00.
3. Bucher, Jay L. 2007. The Quality Calibration Handbook. Milwaukee: ASQ Quality Press.
4. NCSL. 1999. Calibration Control Systems for the Biomedical and Pharmaceutical Industry, RP-6. Boulder, CO.
5. NCSL International. 2006. ANSI/NCSL Z540.3-2006. Boulder, CO.

Traceable and efficient calibrations in the process industry

Today's modern process plants, production processes and quality systems place new and tight requirements on the accuracy of process instruments and on process control. Quality systems, such as the ISO9000 and ISO14000 series of quality standards, call for systematic and well-documented calibrations with regard to accuracy, repeatability, uncertainty, confidence levels, etc. Does this mean that electricians and instrumentation people should be calibration experts? Not really, but the topic should not be ignored. Fortunately, modern calibration techniques and calibration systems have made it easier to fulfill the requirements for instrumentation calibration and maintenance in a productive way. Nevertheless, some understanding of the techniques, terminology and methods involved in calibration is needed in order to perform according to international quality systems.

1. What is calibration and why calibrate
Calibration can briefly be described as an activity where the instrument being tested is compared to a known reference value, i.e. a calibrator. The keywords here are 'known reference', which means that the calibrator used should have a valid, traceable calibration certificate. To be able to answer the question of why we calibrate, we must first determine what measurement is and why measuring is necessary.

WHAT IS MEASUREMENT?
In technical standards, the word measurement has been defined as: "A set of experimental operations for the purpose of determining the value of a quantity." What, then, is the value of a quantity? According to the standards, the true value of a quantity is: "The value which characterizes a quantity perfectly defined during the conditions which exist at the moment when the value is observed. Note: the true value of a quantity is an ideal concept and, in general, it cannot be known." Therefore all instruments display false indications!

[Figure: Hierarchy of accuracy – at the top, the true value; below it, international and national standards maintained by authorized laboratories; then the house and working standards of instrument departments; and finally process instrumentation.]

2. Why measure?
The purpose of a process plant is to convert raw material, energy, manpower and capital into products in the best possible way. This conversion always involves optimization, which must be done better than the competitors do it. In practice, optimization is carried out by means of process automation. However, regardless of how advanced the process automation system is, the control can never be better than the quality of the measurements from the process.

3. Why calibrate
The primary reason for calibrating is that even the best measuring instruments lack absolute stability; in other words,

[Figure: Everything is based on measurements – the process control system takes measurements from the instrumentation and issues controls; the instrumentation takes measurements from the process and makes adjustments; the process converts production factors into products.]

they drift and lose their ability to give accurate measurements. This drift makes recalibration necessary. Environmental conditions, elapsed time and the type of application can all affect the stability of an instrument. Even instruments of the same manufacturer, type and range can show varying performance: one unit may have good stability, while another performs differently. Other good reasons for calibration are:

• To maintain the credibility of measurements
• To maintain the quality of process instruments at a good-as-new level
• Safety and environmental regulations
• ISO9000, other quality systems and regulations
The ISO9000 and ISO14000 series can guide regular, systematic calibrations, which produce uniform quality and minimize negative impacts on the environment.

[Figure: Quality maintenance over time – instrument quality starts at the purchased level (QP) and drifts; periodic calibrations C1–C7 restore it to a 'good as new' level (QM, maintained quality), whereas without maintenance the quality (QZM, zero-maintained quality) falls below the lower tolerance.]

4. Traceability
Calibrations must be traceable. Traceability is a declaration stating to which national standard a certain instrument has been compared.

5. Regulatory requirements for calibration
5.1 ISO9001: 2008
The organization determines the monitoring and measurements to be performed, as well as the measuring devices needed to provide evidence of a product's conformity to determined standards. The organization establishes the processes for ensuring that monitoring and measurements are carried out, and carried out in a manner consistent with the monitoring and measurement requirements. Where necessary to ensure valid results, measuring equipment is:
• calibrated or verified at specified intervals against measurement standards traceable to national or international standards; if no such standards exist, the basis used for calibration or verification is recorded
• adjusted or re-adjusted as necessary
• identified so that its calibration status can be determined
• safeguarded against adjustments that would invalidate the measurement result
• protected from damage and deterioration during handling, maintenance and storage
In addition, the organization assesses and records the validity of previous measuring results when the equipment is found not to conform to requirements. The organization then takes appropriate action on the equipment and any product affected. Records of the calibration and verification results are maintained. When computer software is used in the monitoring and measurement of specified requirements, its ability to satisfy the intended application is confirmed. This is done prior to initial use and reconfirmed as necessary.

[Figure: Traceability chain – SI units; international standards; national standards; reference standards; working standards; process standards.]

Note: See ISO 10012 for further information.

5.2 PHARMACEUTICAL (FDA, U.S. Food and Drug Administration)
Any pharmaceutical company that sells its products in the USA must comply with FDA regulations, regardless of where the products are manufactured.

• Calibration records must be maintained.
• Calibrations must be done in accordance with written, approved procedures.
• There should be a record of the history of each instrument.
• All instrumentation should have a unique ID; all product, process and safety instruments should be physically tagged.
• A calibration period and error limits should be defined for each instrument.
• Calibration standards should be traceable to national and international standards.
• Calibration standards must be more accurate than the required accuracy of the equipment being calibrated.
• All instruments used must be fit for purpose.
• There must be documented evidence that personnel involved in the calibration process have been trained and are competent.
• A documented change management system must be in place.
• All electronic systems must comply with FDA's 21 CFR Part 11.
• All of the above should be implemented in conjunction with the following regulations:
– 21 CFR Part 211 – "Current Good Manufacturing Practice for Finished Pharmaceuticals"
– 21 CFR Part 11 – "Electronic Records; Electronic Signatures"
Software systems need features such as Electronic Signature, Audit Trail, User Management, and Security System to be able to comply with these regulations. In such a system, the Electronic Signature is considered equivalent to a hand-written signature. Users must understand their responsibilities once they give an electronic signature. An Audit Trail is required to support change management. Audit Trails should record all modifications that add, edit, or delete data from an electronic record.
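As a rough illustration of what such an audit trail must capture (this is an invented structure, not any vendor's schema or the regulation's wording), an entry might record at least the following:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditTrailEntry:
    """One immutable audit-trail row: who changed what, when, and how."""
    timestamp: datetime   # when the change was made (UTC)
    user_id: str          # authenticated user making the change
    record_id: str        # the electronic record affected
    action: str           # "add", "edit" or "delete"
    field: str            # which field was touched
    old_value: str
    new_value: str
    reason: str           # reason for change, where required

# Hypothetical example entry correcting a transcription error:
entry = AuditTrailEntry(datetime.now(timezone.utc), "jsmith",
                        "CAL-2012-0431", "edit", "as_left_reading",
                        "4.02", "4.01", "transcription error corrected")
```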

5.3 PHARMACEUTICAL (EU GMPs)
Any pharmaceutical company that sells its products in the European Union must comply with EU GMPs, including Annex 11, regardless of where the products are manufactured. The requirements of the EU GMPs are similar to those of the US FDA, as described in Section 5.2.

6. DEFINITIONS OF METROLOGICAL TERMS
Some metrological terms associated with the concept of calibration are described in this section. Quite a few of the following terms are also used on specification sheets for calibrators. Please note that the definitions listed here are simplified.
Calibration
An unknown measured signal is compared to a known reference signal.

Validation
Validation of measurement and test methods (procedures) is generally necessary to prove that the methods are suitable for the intended use.
Non-linearity
Non-linearity is the maximum deviation of a transducer's output from a defined straight line. Non-linearity is specified by the Terminal Based method or the Best Fit Straight Line method.
Resolution
Resolution is the smallest interval that can be read between two readings.
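As an illustration of the Best Fit Straight Line method, the sketch below (using NumPy) fits a least-squares line to measured input/output pairs and reports the maximum deviation; the data values are invented.

```python
import numpy as np

inputs = np.array([0.0, 25.0, 50.0, 75.0, 100.0])       # applied input, % of span
outputs = np.array([0.02, 25.20, 50.25, 75.15, 99.98])  # transducer output

slope, intercept = np.polyfit(inputs, outputs, 1)        # least-squares line
deviations = outputs - (slope * inputs + intercept)
nonlinearity = np.max(np.abs(deviations))                # BFSL non-linearity
print(f"Non-linearity (BFSL): {nonlinearity:.3f} % of span")
```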

Sensitivity
Sensitivity is the smallest variation in input which can be detected as an output. Good resolution is required in order to detect sensitivity.
Hysteresis
Hysteresis is the deviation in output at any point within the instrument's sensing range when the point is first approached with increasing values and then with decreasing values.

Repeatability
Repeatability is the capability of an instrument to give the same output for repeated inputs of the same value over a period of time. Repeatability is often expressed in the form of a standard deviation.
Temperature coefficient
The temperature coefficient is the change in a calibrator's accuracy caused by changes in ambient temperature (deviation from reference conditions). It is usually expressed as % F.S. / °C or % of RDG / °C.
Stability
Often referred to as drift, stability is expressed as the percentage change in the calibrated output of an instrument over a specified period, usually 90 days to 12 months, under normal operating conditions. Drift is usually given as a typical value.
Accuracy
Generally, accuracy figures state the closeness of a measured value to a known reference value. The accuracy of the reference value is generally not included in the figures. It must also be checked whether errors such as non-linearity, hysteresis and temperature effects are included in the accuracy figures provided. Accuracy is usually expressed as % F.S. or as % of RDG + adder. The difference between these two expressions is great. The only way to compare accuracy presented in different ways is to calculate the total error at certain points.
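For example, with invented specifications, compare a device specified as 0.05 % F.S. on a 0–20 bar range with one specified as 0.025 % of reading + 0.01 % F.S. Which is more accurate depends on the measurement point:

```python
full_scale = 20.0  # bar

def error_fs(reading, pct_fs=0.05):
    """Spec expressed as % of full scale: a constant error over the range."""
    return full_scale * pct_fs / 100

def error_rdg_plus_adder(reading, pct_rdg=0.025, adder_pct_fs=0.01):
    """Spec expressed as % of reading + adder: grows with the reading."""
    return reading * pct_rdg / 100 + full_scale * adder_pct_fs / 100

for point in (2.0, 10.0, 20.0):
    print(f"{point:5.1f} bar: {error_fs(point):.4f} bar "
          f"vs {error_rdg_plus_adder(point):.4f} bar")
# At 2 bar:  0.0100 vs 0.0025 bar -> the %RDG spec is far tighter low in the range.
# At 20 bar: 0.0100 vs 0.0070 bar -> still tighter, but the gap narrows.
```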

Uncertainty
Uncertainty is an estimate of the limits, at a given coverage factor (or confidence level), which contain the true value. Uncertainty is evaluated according to either a "Type A" or a "Type B" method. Type A involves the statistical analysis of a series of measurements; the components evaluated this way include measurement errors that can vary in magnitude and in sign in an unpredictable manner. The other group of components, Type B, could be said to be of a systematic nature. Systematic errors or effects remain constant during the measurement. Examples of systematic effects include errors in the reference value, the set-up of the measurement, ambient conditions, etc. A Type B evaluation is used when the uncertainty of a single measurement is expressed. It should be noted that, in general, errors due to observer fallibility cannot be accommodated within the calculation of uncertainty. Examples of such errors include errors in recording data, errors in calculation, or the use of inappropriate technology.

Type A uncertainty
The Type A method of calculation can be applied when several independent measurements have been made under the same conditions. If there is sufficient resolution in the measurement, there will be an observable difference in the values measured. The standard deviation, often called the "root-mean-square repeatability error", of a series of measurements made under the same conditions is used for the calculation. The standard deviation is used as a measure of the dispersion of the values.
Type B uncertainty
Type B evaluation of uncertainty involves the use of means other than the statistical analysis of a series of measurements. It involves the evaluation of uncertainty using scientific judgement based on all available information concerning the possible variables. Values belonging to this category may be derived from:
• Experience with or general knowledge of the behavior and properties of relevant materials and instruments • Ambient temperature • Humidity • Local gravity • Atmospheric pressure • Uncertainty of the calibration standard • Calibration procedures • Method used to register calibration results • Method to process calibration results

The proper use of the available information calls for insight based on experience and general knowledge. It is a skill that can be learnt with practice. A well-based Type B evaluation of uncertainty can be as reliable as a Type A evaluation, especially in a measurement situation where a Type A evaluation is based on only a comparatively small number of statistically independent measurements.
Expanded uncertainty
The EA has decided that calibration laboratories accredited by members of the EA shall state an expanded uncertainty of measurement, obtained by multiplying the standard uncertainty by a coverage factor k. In cases where a normal (Gaussian) distribution can be assumed, the standard coverage factor k=2 should be used. The expanded uncertainty then corresponds to a coverage probability (or confidence level) of approximately 95%. For uncertainty specifications, there must be a clear statement of the coverage probability or confidence level. Usually one of the following confidence levels is used:
k=1 (1σ) ≈ 68%
k=2 (2σ) ≈ 95%
k=3 (3σ) ≈ 99%
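Putting the Type A calculation and the coverage factor together, a minimal sketch follows; the readings are invented, and the standard deviation of the mean is assumed as the Type A standard uncertainty.

```python
import statistics

# Repeated independent measurements under the same conditions:
readings = [10.012, 10.015, 10.011, 10.014, 10.013, 10.012]

mean = statistics.mean(readings)
s = statistics.stdev(readings)          # experimental standard deviation
u_a = s / len(readings) ** 0.5          # Type A standard uncertainty of the mean
U = 2 * u_a                             # expanded uncertainty, k = 2 (~95 %)
print(f"mean = {mean:.4f}, U (k=2) = {U:.4f}")
```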

7.  CALIBRATION MANAGEMENT Many companies do not pay enough attention to calibration management although it is a requirement e.g. in ISO9001: 2008. The maintenance management system may alert when calibration is
needed and then opens up a work order. Once the job has been done, the work order is closed and the maintenance system is satisfied. Unfortunately, what happens between the opening and closing of the work order is often not documented at all. If something is documented, it is usually in the form of a hand-written sheet that is then archived; if the calibration results need to be examined at a later time, finding the sheets requires a lot of effort. Choosing professional tools for maintaining calibration records and performing the calibrations can save a lot of time, effort and money. An efficient calibration management system consists of calibration management software and documenting calibrators.
Modern calibration management software can be a tool that automates and simplifies calibration work at all levels. It automatically creates a list of instruments due to be calibrated in the near future. If the software is able to interface with other systems, the scheduling of calibrations can be done in the maintenance system, from which the work orders can be automatically loaded into the calibration management software.
When the technician is about to calibrate an instrument, (s)he simply downloads the instrument details from the calibration management software into the memory of a documenting calibrator; no printed notes are needed. The 'As Found' and 'As Left' results are saved in the calibrator's memory, and there is no need to write anything down with a pen. The instrument's measurement ranges and error limits are defined in the software and also downloaded to the calibrator. Thus the calibrator is able to detect whether the calibration passed or failed immediately after the last calibration point is recorded; there is no need to make tricky calculations manually in the field. All this saves an extensive amount of time and prevents the user from making mistakes. The increase in work productivity allows more calibrations to be carried out in the same period of time as before. Depending on which process variable is calibrated and how many calibration points are recorded, using automated tools can be 5 to 10 times faster than manual recording. When the calibration results are uploaded to the database, the software automatically detects which calibrator was used, and the traceability chain is documented without requiring any further actions from the user.
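Those "tricky calculations" amount to comparing the instrument's error, commonly expressed in per cent of span, against its error limit at each calibration point. A minimal sketch follows; the 4–20 mA transmitter over 0–10 bar and the 0.5 % limit are invented values, not any particular calibrator's behavior.

```python
def point_error_pct_span(input_value, output_ma,
                         in_lo=0.0, in_hi=10.0,     # input range, bar
                         out_lo=4.0, out_hi=20.0):  # output range, mA
    """Error at one calibration point, in % of output span."""
    ideal_ma = out_lo + (input_value - in_lo) / (in_hi - in_lo) * (out_hi - out_lo)
    return (output_ma - ideal_ma) / (out_hi - out_lo) * 100

error_limit = 0.5  # % of span, defined per instrument in the software
points = [(0.0, 4.01), (5.0, 12.06), (10.0, 19.95)]  # (bar applied, mA measured)
passed = all(abs(point_error_pct_span(p, ma)) <= error_limit for p, ma in points)
print("PASSED" if passed else "FAILED")
```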

Calibration records, including the full calibration history of an instrument, are kept in the database; accessing previous results therefore takes just a few seconds. When an instrument has been calibrated several times, the software displays the 'History Trend', which assists in determining whether or not the calibration period should be changed.
One of today's trends is the move towards a paperless office. If the calibration management software includes the right tools, it is possible to manage calibration records on a computer without producing any paper. If paper copies of certificates are preferred, printing them must, of course, be possible. When all calibration-related data is located in a single database, the software is obviously able to create calibration-related reports and documents.
Today's documenting calibrators are capable of calibrating many process signals. It is quite common to have a calibrator that calibrates pressure, temperature and electrical signals, including frequency and pulses. In addition to the conventional mA output of a transmitter, modern calibrators can also read the HART, Foundation Fieldbus or Profibus output of transmitters, and they can even be used for configuring these 'smart' transmitters.
Implementing a modern calibration management system benefits everybody who has anything to do with instrumentation. For instance, the maintenance manager can use it as a calibration planning and decision-making tool for tracking and managing all calibration-related activities. When an auditor comes for a visit, QA will find a calibration management system useful: the requested calibration records can be viewed on screen with a couple of mouse clicks. If a calibrator drifts out of its specifications, it is possible to use a 'reverse traceability report' to get a list of the instruments that have been calibrated with that calibrator. Good calibration tools help technicians work more efficiently and accurately. If the system manufacturer has paid attention to usability, the system is easy to learn and use. When many tasks are automated, the users can concentrate on their primary job.
Transferring to a new calibration system may sound like a huge task, and it can be. There are probably thousands of instruments that need to be entered into the database, and all the details must be checked and verified before the system is up and running. But although there is a lot of data involved, it does not mean the job is an enormous one.

Nowadays most companies have instrumentation data in some type of electronic format: Excel spreadsheets, maintenance databases, etc. The vendor of the calibration system is most likely able to import most of the existing data into the calibration database, saving months of work.

CONCLUSION

A good, automated calibration system reduces workload because it carries out tasks faster, more accurately and with better results than could be achieved with a manual system. It assists in documenting, scheduling, planning, analyzing and, finally, optimizing the calibration work.

References
[1] ISO9001: 2008, "Quality Management Systems. Requirements"
[2] 21 CFR Part 11, "Electronic Records; Electronic Signatures"
[3] 21 CFR Part 211, "Current Good Manufacturing Practice for Finished Pharmaceuticals"

Calibration Management and Maintenance

Why Calibrate? What is the risk of not calibrating?

Calibration can be briefly described as an activity where the instrument being tested is compared to a known reference value. At the simplest level, calibration is a comparison between measurements – one of known magnitude or correctness made or set with one device, and another measurement made in as similar a way as possible with a second device. The device with the known or assigned correctness is called the standard. The second device is the unit under test or test instrument. Calibration is often required with a new instrument, or when a specified time period or a specified number of operating hours has elapsed. In addition, calibration is usually carried out when an instrument has been subjected to an unexpected shock or vibration that may have put it out of its specified limits.

Calibration in industrial applications
When a sensor or instrument experiences temperature variations or physical stress over time, its performance will invariably begin to decline; this is known as 'drift'. Drift means that measurement data from the sensor becomes unreliable and could even affect the quality of a company's production. Although drift cannot be completely eliminated, it can be discovered and rectified via calibration. The purpose of calibration is to determine how accurate an instrument or sensor is. Although most instruments provide high accuracy these days, regulatory bodies often need to know just how inaccurate a particular instrument is and whether it drifts in and out of its specified tolerance over time.

The costs and risks of not calibrating

Unfortunately, calibration has costs associated with it, and in uncertain economic times this activity can become neglected, or the interval between calibration checks can be extended, in order to cut costs or simply through a lack of resources or manpower. However, neglecting calibration can lead to unscheduled production or machine downtime, product and process quality issues, or even product recalls and rework. Furthermore, if the instrument is critical to a process or is located in a hazardous area, allowing that sensor to drift over time could potentially put employee safety at risk. Similarly, an end product manufactured by a plant with poorly calibrated instruments could present a risk to both consumers and customers. In certain situations, this may even lead to a company losing its license to operate because it no longer meets its regulatory requirements. This is particularly true for the food and beverage sector and for pharmaceutical manufacturers.
Weighing instruments also need to be calibrated regularly. Determining the correct mass of a product or material is particularly important for companies that invoice customers based on the mass of what they supply (fiscal metering), such as steel, paper and pulp, and power suppliers, aviation companies, harbors and retail outlets. These companies need to prove not only that the mass is accurate, but also that the equipment producing the readings was correctly calibrated. Invoicing in these industries is often based on process measurements, so there is a growing need to have the metrological quality of these weighing instruments confirmed by calibration. Product manufacturing also depends on accurate masses, so laboratories and production departments in the food and beverage, oil and gas, energy, chemical and pharmaceutical industries also need to calibrate their weighing instruments.

Why is calibration important?
Calibration ensures that instrument drift is minimized. Even the highest quality instruments will drift over time and lose their ability to provide accurate measurements. It is therefore critical that all instruments are calibrated at appropriate intervals. The stability of an instrument very much depends on its application and the environment it operates in. Fluctuating temperatures, harsh
manufacturing conditions (dust and dirt) and elapsed time are all contributing factors here. Even instruments manufactured by the same supplier can vary in their performance over time. Calibration also ensures that product or batch quality remains high and consistent over time. Quality systems such as ISO 9001, ISO 9002 and ISO 14001 require systematic, well-documented calibrations with respect to accuracy, repeatability, uncertainty and confidence levels. This affects all process manufacturers.
Armando Rivero Rubalcaba is head of Instrumentation at beer producer Heineken (Spain). He comments: "For Heineken, the quality of the beer is a number one priority. All the plants in Spain have received ISO 9001 and ISO 14001 certifications, in addition to the BRC certificate of food safety. We must therefore ensure that all processes correspond to the planned characteristics. The role of calibration is very important to ensure the quality and safety of the processes."
Pharmaceutical manufacturers must follow current Good Manufacturing Practices (GMP), which require that calibration records are maintained and that calibrations are carried out in accordance with written, approved procedures. Typically, each instrument has a master history record and a unique ID. All product, process and safety instruments should also be physically tagged. Furthermore, a calibration interval and error limits should be defined for each instrument, and standards should be traceable to national and international standards. Standards must also be more accurate than the required accuracy of the equipment being calibrated. On the people side, there must be documented evidence that employees involved in the calibration process have been properly trained and are competent. The company must also have a documented change management system in place, with all electronic systems complying with FDA regulation 21 CFR Part 11.
In the power generation, energy and utilities industries, instrument calibration can help to optimize a company's production process or to increase the plant's production capacity. For example, at the Almaraz Nuclear Power Plant in Spain, improving the measurement of reactor power parameters from 2% to 0.4% enabled the reactor power in each unit to be increased by 1.6%, which has a significant effect on annual production capacity.
Safety is another important reason to calibrate instruments. Production environments are potentially high-risk areas for employees and can involve high temperatures and high pressures. Incorrect measurements in a hazardous area could lead to serious consequences,
Incorrect measurements in a hazardous area could lead to serious consequences, particularly in the oil and gas, petrochemicals and chemicals sectors. Similarly, manufacturers of food and beverage or pharmaceutical products could put their customers’ lives at risk by neglecting to calibrate their process instruments. Heikki Karhe is a measurement technician at the tyre manufacturer Nokian Tyres. As he puts it: “Calibration is of great importance, especially from the viewpoint of production safety and quality of the final product. Preparation of the right rubber mixture is precision work and a sample is taken from each rubber mixture to ensure quality. Measuring instruments that yield wrong values could easily ruin the final product. The factory is also full of pressure instruments and so it is also important for the safety of the workers that those instruments show the right values.”

Neglecting to calibrate process instruments can also affect a company’s bottom-line profits. This is particularly true if sales invoicing is based on accurate process measurements, for example, weighing scales or gas conversion devices. Indeed, according to recent research by Nielsen Research/ATS Studies, poor-quality calibration costs manufacturers more than 1.7 million US dollars every year on average. When only large companies with revenues of more than 1 billion US dollars are considered, this figure rises dramatically to more than 4 million US dollars per year.

Proper invoicing is therefore critical to energy and utilities companies. As Jacek Midera, measurement specialist at Mazovian Gas Company, states: “Most importantly, accurate measurements ensure proper billing. The impact of even a small measurement error can be tremendous in terms of lost revenue. Customers want to pay for the exact amount of gas they’ve received. Therefore, gas conversion devices must be extremely accurate in measuring delivered gas. This means that requirements for the calibrators are especially high.”

Today, controlling emissions is another critical factor for many process manufacturers. Calibrating instruments can help to make combustion more efficient in industrial ovens and furnaces. The latest government regulations relating to carbon emissions may also require companies to calibrate specific instruments on a regular basis, including sensors used for measuring CO2 and NOx emissions. As Ed de Jong, Instrument Maintenance Engineer at Shell (Netherlands), explains: “Until recently, calibration was mainly driven by economic motives: even the smallest of errors in delivery quantities are unacceptable in Shell’s operation due to the vast sums of money involved for both customers and governments [fiscal metering]. Nowadays, calibration has an important role especially for the license to operate. Government regulations demand that specific instruments must be calibrated, for example, instruments related to CO2 and NOx emissions.”

Common misconceptions

There are some common misconceptions when it comes to instrument calibration. For example, some manufacturers claim that they do not need to calibrate their fieldbus instruments because they are digital and so are always accurate and correct. This is simply not true. The main difference between fieldbus and conventional transmitters is that the output signal is a fully digital fieldbus signal. Changing the output signal does not change the need for periodic calibration. Although fieldbus transmitters have improved measurement accuracy compared to analogue transmitters, this does not eliminate the need for calibration.

Another common misunderstanding is that new instruments do not require calibration. Again, this is not true. Just because a sensor is newly installed does not mean that it will perform within the required specifications. By calibrating an instrument before installation, a company is able to enter all the necessary instrument data into its calibration database or calibration management software, as well as begin to monitor the stability or drift of the instrument over time.

When to calibrate

Due to drift, all instruments require calibrating at set intervals. How often they are calibrated depends on a number of factors. First, the manufacturer of the instrument will provide a recommended calibration interval. This interval may need to be shortened if the instrument is used in a critical process or application. Quality standards may also dictate how often a pressure or temperature sensor needs calibrating. The most effective method of determining when an instrument requires calibrating, however, is history trend analysis: the optimal calibration interval for different instruments can only be determined with software-based analysis of each instrument’s calibration history. In this way, highly stable sensors are not calibrated as often as those sensors that are more susceptible to drift.
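To make the idea concrete, the minimal sketch below (in Python) estimates an instrument’s drift rate from its calibration history and projects how long the instrument stays within tolerance after adjustment. The dates, error values and tolerance limit are invented for illustration; calibration management software performs this kind of analysis automatically from stored results.

    from datetime import date

    # Hypothetical calibration history for one transmitter:
    # (calibration date, as-found error in % of span).
    history = [
        (date(2009, 1, 15), 0.05),
        (date(2009, 7, 14), 0.12),
        (date(2010, 1, 12), 0.21),
        (date(2010, 7, 13), 0.27),
    ]
    tolerance = 0.50  # acceptable error limit, % of span

    # Average drift rate (change in error per day) between calibrations.
    pairs = zip(history, history[1:])
    rates = [(e2 - e1) / (d2 - d1).days for (d1, e1), (d2, e2) in pairs]
    avg_rate = sum(rates) / len(rates)

    # Rough projection of how long the instrument stays within tolerance
    # after being adjusted back to zero error - a starting point for the
    # calibration interval, before any safety margin is applied.
    print(f"Average drift: {avg_rate:.5f} % of span per day")
    print(f"Estimated days to reach tolerance: {tolerance / avg_rate:.0f}")

Run against a real history, a projection like this gives a defensible basis for shortening or extending an interval rather than guessing.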

Why use software for calibration management?

Every manufacturing plant has some sort of system in place for managing instrument calibration operations and data. Plant instrumentation devices – temperature sensors, pressure transducers, weighing instruments and the like – require regular calibration to ensure they are performing and measuring to specified tolerances. However, different companies from a diverse range of industry sectors use very different methods of managing these calibrations. These methods differ greatly in terms of cost, quality, efficiency, accuracy of data and level of automation.

Calibration software is one such tool that can be used to support and guide calibration management activities, with documentation being a critical part of this. But in order to understand how software can help process plants better manage their instrument calibrations, it is important to consider the typical calibration management tasks that companies have to undertake. There are five main areas here: planning and decision-making; organisation; execution; documentation; and analysis.

Careful planning and decision-making is important. All plant instruments and measurement devices need to be listed, then classified into ‘critical’ and ‘non-critical’ devices. Once this has been agreed, the calibration range and required tolerances need to be identified. Decisions then need to be made regarding the calibration interval for each instrument. The creation and approval of standard operating procedures (SOPs) for each device is then required, followed by the selection of suitable calibration methods and tools for execution of these methods. Finally, the company must identify the current calibration status of every instrument across the plant.
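The outcome of this planning stage is, in effect, a structured record for every instrument. The sketch below shows what such a record might look like; the field names, class and values are assumptions made for illustration, not any vendor’s data model.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class Instrument:
        # Illustrative planning-stage record for one plant position.
        tag: str                  # position ID, e.g. "TT-101"
        critical: bool            # 'critical' vs 'non-critical'
        range_low: float          # calibration range, low end
        range_high: float         # calibration range, high end
        unit: str
        tolerance: float          # acceptable error, engineering units
        interval_days: int        # agreed calibration interval
        sop_id: str               # approved procedure for this device
        last_calibrated: Optional[date] = None

        def is_due(self, today: date) -> bool:
            # Current status: never calibrated, or interval elapsed.
            if self.last_calibrated is None:
                return True
            return (today - self.last_calibrated).days >= self.interval_days

    sensor = Instrument("TT-101", True, 0.0, 200.0, "degC",
                        0.5, 180, "SOP-CAL-014", date(2011, 3, 1))
    print(sensor.is_due(date(2012, 1, 10)))  # True: more than 180 days ago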

The next stage, organisation, involves training the company’s calibration staff – typically maintenance technicians, service engineers, process and quality engineers and managers – in using the chosen tools and following the approved SOPs. Resources then have to be organised and assigned to actually carry out the scheduled calibration tasks.

The execution stage involves supervising the assigned calibration tasks. Staff carrying out these activities must follow the appropriate instructions before calibrating the device, including any associated safety procedures. The calibration is then executed according to the plan, although further instructions may need to be followed after calibration.

The documentation and storage of calibration results typically involves signing and approving all calibration records that are generated. The next calibration tasks then have to be scheduled, calibration labels need to be created and attached, and the resulting documents copied and archived.

Based on the calibration results, companies then have to analyse the data to see if any corrective action needs to be taken. The effectiveness of calibration needs to be reviewed and calibration intervals checked. These intervals may need to be adjusted based on archived calibration history. If, for example, a sensor drifts out of its specification range, the consequences could be disastrous for the plant, resulting in costly production downtime, a safety problem or batches of inferior-quality goods being produced, which may then have to be scrapped.

Documentation

Documentation is a very important part of a calibration management process. ISO 9001:2008 and the FDA both state that calibration records must be maintained and that calibration must be carried out according to written, approved procedures. This means an instrument engineer can spend as much as 50 per cent of his or her time on documentation and paperwork – time that could be better spent on other value-added activities. This paperwork typically involves preparing calibration instructions to help field engineers; making notes of calibration results in the field; and documenting and archiving calibration data. Imagine how long and difficult a task this is if the plant has thousands of instruments that require calibrating on at least a six-monthly basis. The amount of manual documentation increases almost exponentially.

When it comes to the volume of documentation required, different industry sectors have different requirements and regulations. In the Power & Energy sector, for example, just under a third of companies (with 500+ employees) typically have more than 5,000 instruments that require calibrating, and 42 per cent of companies perform more than 2,000 calibrations each year. In the highly regulated pharmaceuticals sector, 75 per cent of companies carry out more than 2,000 calibrations per year. Oil, Gas & Petrochemicals is similarly high, with 55 per cent of companies performing more than 2,000 calibrations each year. The percentage is still quite high in the food & beverage sector, where 21 per cent of firms said they calibrated their instruments more than 2,000 times every year. This equates to a huge amount of paperwork for any process plant.

These figures suggest that companies really do require some sort of software tool to help them manage their instrument calibration processes and all associated documentation. However, the picture in reality can be very different.

Only a quarter of companies use calibration software

In a Calibration Study recently carried out by Beamex, a mere 25 per cent of companies with 500+ employees (across the industry sectors mentioned above) said that they used specialist calibration management software. Many other companies said that they relied on generic spreadsheets and/or databases, whilst others used a calibration module within an existing Computerised Maintenance Management System (CMMS). A significant proportion (almost 20 per cent) of those surveyed said they used a manual, paper-based system.

Any paper-based calibration system will be prone to human error. Noting down calibration results by hand in the field and then transferring these results into a spreadsheet back at the office may seem archaic, but many firms still do this. Furthermore, analysis of paper-based systems and spreadsheets is time consuming, if not almost impossible.

In a recent survey conducted by Control Magazine, 40 per cent of companies surveyed said that they calculated calibration intervals by using historical trend analysis – which is encouraging. However, many of these firms said they were doing it without any sort of calibration software to assist them. The other 60 per cent of companies determined instrument calibration intervals based either on the manufacturer’s own recommendation or on a uniform interval applied across the plant for all instruments. Neither method is ideal in practice.

Companies could save considerable time and reduce costs by using calibration management software to analyse historical trends and calibration results. Using software for calibration management enables faster, easier and more accurate analysis of calibration records and identification of historical trends. Plants can therefore reduce costs and optimise calibration intervals by reducing calibration frequency where possible, or by increasing the frequency where necessary. For example, for improved safety, a process plant may find it necessary to increase the calibration frequency of some sensors that are located in a hazardous, potentially explosive area of the manufacturing plant. Just as important, by analysing the calibration history of a flow meter that is located in a ‘non-critical’ area of the plant, the company may be able to decrease the frequency of calibration, saving time and resources. Rather than rely on the manufacturer’s recommendation for calibration intervals, the plant may be able to extend these intervals by looking closely at historical trends provided by calibration management software. Instrument ‘drift’ can be monitored closely over a period of time and decisions then taken confidently with respect to amending the calibration interval.

Regardless of industry sector, there seem to be some general challenges that companies face when it comes to calibration management. The number of instruments, and the total number of periodic calibrations these devices require, can run to several thousand per year, so planning and scheduling are important for keeping track of each instrument’s calibration procedures. Furthermore, every instrument calibration has to be documented and these documents need to be easily accessible for audit purposes.

Paper-based systems

These systems typically involve hand-written documents. Typically, this might include engineers using pens and paper to record calibration results while out in the field. On returning to the office, these notes are then tidied up or transferred to another paper document, after which they are archived as paper documents. While using a manual, paper-based system requires little or no investment, it is very labour-intensive and means that historical trend analysis becomes very difficult to carry out. In addition, the calibration data is not easily accessible. The system is time consuming, soaks up a lot of resources, and typing errors are commonplace. Dual effort and re-keying of calibration data are also significant costs here.

In-house legacy systems (spreadsheets, databases, etc.)

Although certainly a step in the right direction, using an in-house legacy system to manage calibrations has its drawbacks. In these systems, calibration data is typically entered manually into a spreadsheet or database. The data is stored in electronic format, but the recording of calibration information is still time-consuming and typing errors are common. Also, the calibration process itself cannot be automated. For example, automatic alarms cannot be set up for instruments that are due for calibration.

Calibration module of a CMMS

Many plants have already invested in a Computerised Maintenance Management (CMM) system and so continue to use this for calibration management. Plant hierarchy and works orders can be stored in the CMM system, but the calibration cannot be automated because the system is not able to communicate with ‘smart’ calibrators. Furthermore, CMM systems are not designed to manage calibrations and so often provide only minimum calibration functionality, such as the scheduling of tasks and entry of calibration results. Although instrument data can be stored and managed efficiently in the plant’s database, the level of automation is still low. In addition, the CMM system may not meet the regulatory requirements (e.g. FDA) for managing calibration records.
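A gap shared by spreadsheets and CMM systems is that nothing in the data store itself raises an alarm when calibrations fall due. The fragment below sketches the kind of automatic check that dedicated calibration software runs continuously; the register contents and the 30-day alert horizon are invented for the example.

    from datetime import date, timedelta

    # Hypothetical register of next-due dates per instrument tag.
    next_due = {
        "PT-204": date(2012, 2, 1),
        "TT-101": date(2012, 6, 15),
        "FT-330": date(2012, 1, 20),
    }

    def due_soon(register, today, horizon_days=30):
        # Return the tags whose calibration falls within the horizon.
        horizon = today + timedelta(days=horizon_days)
        return sorted(tag for tag, due in register.items() if due <= horizon)

    print(due_soon(next_due, date(2012, 1, 10)))  # ['FT-330', 'PT-204']

In a spreadsheet this comparison is a manual chore; in dedicated software it feeds alerts and work orders without anyone having to remember to look.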

Calibration software

With specialist calibration management software, users are provided with an easy-to-use, Windows Explorer-like interface. The software manages and stores all instrument and calibration data. This includes the planning and scheduling of calibration work; analysis and optimisation of calibration frequency; production of reports, certificates and labels; communication with smart calibrators; and easy integration with CMM systems such as SAP and Maximo. The result is a streamlined, automated calibration process, which improves quality, plant productivity and efficiency.

Benefits of using calibration software

With software-based calibration management, planning and decision-making are improved. Procedures and calibration strategies can be planned and all calibration assets managed by the software. Position, device and calibrator databases are maintained, while automatic alerts for scheduled calibrations can be set up.

Organisation also improves. The system no longer requires pens and paper. Calibration instructions are created using the software to guide engineers through the calibration process. These instructions can also be downloaded to a technician’s handheld documenting calibrator while they are in the field.

Execution is more efficient and errors are eliminated. Using software-based calibration management systems in conjunction with documenting calibrators means that calibration results can be stored in the calibrator’s memory, then automatically uploaded back to the calibration software. There is no re-keying of calibration results from a notebook to a database or spreadsheet. Human error is minimised and engineers are freed up to perform more strategic analysis or other important activities.

Documentation is also improved. The software generates reports automatically and all calibration data is stored in one database rather than in multiple disparate systems. Calibration certificates, reports and labels can all be printed out on paper or sent in electronic format.

Analysis becomes easier too, enabling engineers to optimise calibration intervals using the software’s History Trend function. Also, when a plant is being audited, calibration software can facilitate both the preparation and the audit itself. Locating records and verifying that the system works is effortless when compared to traditional calibration record keeping.

Regulatory organisations and standards such as the FDA and ISO place demanding requirements on the recording of calibration data. Calibration software has many functions that help in meeting these requirements, such as Change Management, Audit Trail and Electronic Signature functions. The Change Management feature in Beamex’s CMX software, for example, complies with FDA requirements.
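The download/upload round trip described above can be pictured as a small interface between software and device. The class and method names below are hypothetical stand-ins chosen for this sketch; real documenting calibrators expose vendor-specific interfaces.

    class DocumentingCalibrator:
        # Hypothetical stand-in for a handheld documenting calibrator.
        def __init__(self):
            self._tasks, self._results = [], []

        def download(self, tasks):
            # The software sends scheduled tasks and instructions.
            self._tasks = list(tasks)

        def record(self, tag, as_found, as_left):
            # The technician calibrates; the device stores the result.
            self._results.append(
                {"tag": tag, "as_found": as_found, "as_left": as_left})

        def upload(self):
            # Results return to the software - no manual re-keying.
            return list(self._results)

    device = DocumentingCalibrator()
    device.download([{"tag": "PT-204", "sop": "SOP-CAL-007"}])
    device.record("PT-204", as_found=0.32, as_left=0.04)
    for result in device.upload():
        print(result)  # written straight into the calibration database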

Business benefits

For the business, implementing software-based calibration management means overall costs will be reduced. These savings come from the now-paperless calibration process, with no manual documentation procedures. Engineers can analyse calibration results to see whether the calibration intervals on plant instruments can be altered. For example, instruments that perform better than expected may well justify a reduction in their calibration frequency.

Plant efficiencies should also improve, as the entire calibration process is now streamlined and automated. Manual procedures are replaced with automated, validated processes, which is particularly beneficial if the company is replacing a lot of labour-intensive calibration activities. Costly production downtime will also be reduced.

Even if a plant has already implemented a CMM system, calibration management software can easily be integrated with it. If the plant instruments are already defined in a database, the calibration management software can utilise the records available in the CMM system database. The integration will save time, reduce costs and increase productivity by preventing unnecessary double effort and the re-keying of works orders in multiple systems. Integration also enables the plant to automate its calibration management with smart calibrators, which simply is not possible with a standalone CMM system.

Benefits for all process plants

Beamex’s suite of calibration management software can benefit process plants of all sizes. For relatively small plants, where calibration data is needed for only one location, only a few instruments require calibrating and regulatory compliance is minimal, Beamex CMX Light is the most appropriate software. For medium-to-large companies that have multiple users dealing with a large number of instruments and a large amount of calibration work, as well as strict regulatory compliance, Beamex CMX Professional is ideal. Beamex’s high-end solution, CMX Enterprise, is suitable for process manufacturers with multiple global sites, multilingual users and a very large number of instruments that require calibration. Here, a central calibration management database is often implemented that is used by multiple plants across the world.

CHECKLIST
Choosing the right calibration software
• Is it easy to use?
• What are the specific requirements in terms of functionality?
• Are there any IT requirements or restrictions for choosing the software?
• Does the calibration software need to be integrated with the plant’s existing systems?
• Is communication with smart calibrators a requirement?
• Does the supplier offer training, implementation, support and upgrades?
• Does the calibration software need to be scalable?
• Can data be imported to the software from the plant’s current systems?
• Does the software offer regulatory compliance?
• What are the supplier’s references and experience as a software developer?

SUMMARY
Calibration software improves calibration management tasks in all these areas:
• Planning & decision-making
• Organisation
• Execution
• Documentation
• Analysis

The business benefits of using software for calibration management:
• Cost reduction
• Quality improvements
• Increase in efficiency

Beamex users

Beamex recently conducted a survey of its customers across all industry sectors. The results showed that 82% of CMX calibration software customers said that using Beamex products had resulted in cost savings in some part of their operations. 94% of CMX users stated that using Beamex products had improved the efficiency of their calibration processes, whilst 92% said that using CMX had improved the quality of their calibration system.

Summary

Every type of process plant, regardless of industry sector, can benefit from implementing specialist calibration management software. Compared with traditional paper-based systems, in-house legacy calibration systems or calibration modules within CMM systems, using dedicated calibration management software results in improved quality, increased productivity and reduced costs across the entire calibration process. Despite these benefits, only one quarter of the companies that need to manage instrument calibrations actually use software designed for that purpose.

How often should instruments be calibrated?

Plants can improve their efficiency and reduce costs by performing calibration history trend analysis. By doing so, a plant is able to define which instruments can be calibrated less frequently and which should be calibrated more frequently. Calibration history trend analysis is only possible with calibration software that provides this functionality.

Adjusting calibration intervals based on history trend analysis

Manufacturing plants need to be absolutely confident that their instrumentation products – temperature sensors, pressure transducers, flow meters and the like – are performing and measuring to specified tolerances. If sensors drift out of their specification range, the consequences can be disastrous for a plant, resulting in costly production downtime, safety issues or batches of inferior-quality goods being produced, which then have to be scrapped.

Most process manufacturing plants will have some sort of maintenance plan or schedule in place to ensure that all instruments used across the site are calibrated at the appropriate times. However, with the increasing demands and cost pressures placed on manufacturers these days, the time and resources required to carry out these calibration checks are often scarce. This can lead to instruments being prioritised for calibration: those deemed critical enough receive the required regular checks, while sensors deemed less critical to production are calibrated less frequently or not at all.

But plants can improve their efficiencies and reduce costs by using calibration ‘history trend analysis’, a function available within Beamex® CMX calibration software. With this function, the plant can analyze whether it should increase or decrease the calibration frequency for each of its instruments. Cost savings can be achieved in several ways. First, by calibrating less frequently where instruments appear to be highly stable according to their calibration history. Second, by calibrating instruments more often when they are located in critical areas of the plant, ensuring that instruments are checked and corrected before they drift out of tolerance. This type of practice is common in companies that employ an effective ‘Preventive Maintenance’ regime. The analysis of historical trends – how a pressure sensor, for example, drifts in and out of tolerance over a given time period – is only possible with calibration software that provides this type of functionality.

Current practices in process plants

But in reality, how often do process plants actually calibrate their instruments, and how does a maintenance manager or engineer know how often to calibrate a particular sensor? In March 2010, Beamex conducted a survey that asked process manufacturing companies how many instruments in their plant required calibrating and the frequency with which these instruments had to be calibrated. The survey covered all industry sectors, including pharmaceuticals, chemicals, power and energy, manufacturing, service, food and beverage, oil and gas, and paper and pulp.

Interestingly, the survey showed that across all industry sectors, 56% of the respondents said they calibrated their instruments no more than once a year. In the pharmaceuticals sector, however, 59% said they calibrated once a year and 30% said they calibrated twice a year. Perhaps unsurprisingly, given that it is a highly regulated industry, the study also showed that the pharmaceuticals sector typically possesses a significantly higher number of instruments per plant that require calibrating. In addition, these plants calibrate their instruments more frequently than other industry sectors.

The benefits of analyzing calibration history trends

Regardless of the industry sector, by analyzing an instrument’s drift over time (i.e. the historical trend), companies can reduce costs and improve their efficiencies. Pertti Mäki is Area Sales Manager at Beamex. He specialises in selling the Beamex® CMX to customers across all industry sectors. He comments: “The largest savings from using the History Trend Option are in the pharmaceuticals sector, without doubt, but all industry sectors can benefit from using the software tool, which helps companies identify the optimal calibration intervals for instruments.”

The trick, says Mäki, is determining which sensors should be recalibrated after a few days, weeks, or even years of operation and which can be left for longer periods, without of course sacrificing the quality of the product or process or the safety of the plant and its employees. Doing this, he says, enables maintenance staff to concentrate their efforts only where they are needed, eliminating unnecessary calibration effort and time.

But there are other, perhaps less obvious benefits of looking at the historical drift of a particular sensor or set of measuring instruments. As Mäki explains: “When an engineer buys a particular sensor, the supplier provides a technical specification that includes details of the maximum drift of that sensor over a given time period. With CMX’s History Trend Option, the engineer can now verify that the sensor he or she has purchased actually performed within the specified tolerance over a certain time period. If it hasn’t, the engineer now has data to present to the supplier to support his findings.”

But that’s not all. The History Trend function also means that a plant can now compare the quality or performance of different sensors from multiple manufacturers in a given location or set of process conditions. This makes it an invaluable tool for maintenance or quality personnel who, in setting up a new process line for example, can use the functionality to compare different sensor types to see which one best suits the new process.

Calibration software such as CMX can also help with the planning of calibration operations. Calibration schedules take into account the accuracy required for a particular sensor and the length of time during which it has previously been able to maintain that degree of accuracy. Sensors that are found to be highly stable do not need to be re-calibrated as often as sensors that tend to drift.
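As a concrete illustration of the supplier check described above, the fragment below compares a sensor’s observed drift rate with the maximum drift stated on its data sheet. All figures are invented for the example.

    from datetime import date

    # Assume the data sheet specifies at most 0.10 % of span drift per year.
    max_drift_per_year = 0.10
    first = (date(2010, 3, 1), 0.02)  # (calibration date, as-found error)
    last = (date(2012, 3, 1), 0.31)

    years = (last[0] - first[0]).days / 365.25
    observed_rate = abs(last[1] - first[1]) / years

    if observed_rate > max_drift_per_year:
        print(f"Observed drift {observed_rate:.3f} %/yr exceeds the "
              f"specified {max_drift_per_year:.2f} %/yr - raise with supplier")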

The History Trend function enables users to plan the optimal calibration intervals for their instruments. Once implemented, maintenance personnel can analyze an instrument’s drift over a certain time period. History Trend displays the instrument’s drift over a given period both numerically and graphically. Based on this information, it is then possible to draw conclusions and make decisions regarding the optimal calibration interval and the quality of the instruments with respect to measurement performance.

The ‘History Trend’ window enables users to view key figures from several calibration events simultaneously, allowing them to evaluate the calibrations of a position or a device over a longer time period than the normal calibration result view. For example, the user can get an overview of how a particular device drifts between calibrations and whether the drift increases with time. The engineer can also analyze how well different devices are suited for use in a particular area of the plant or process. Reporting is straightforward and the user can even tailor the reports to suit his or her individual needs, using the ‘Report Design’ tool option.

CALIBRATION HISTORY TREND ANALYSIS

Calibration history trend analysis allows you to analyze an instrument’s drift over a certain time period.

[Figures: History Trend report and History Trend user interface]

• The Beamex® CMX stores every calibration event in the database; the history trend is produced automatically without any extra manual work.
• The Beamex® CMX also indicates when new devices have been installed and calibrated. This helps in comparing differences between devices.
• The graphical display of the history trend helps in visualizing and optimizing the calibration interval for the instruments.

SUMMARY

The benefits of calibration history trend analysis:
• Analyzing and determining the optimal calibration interval for instruments
• Conclusions can be drawn regarding the quality of a particular measuring instrument
• Time savings: faster analysis compared to traditional, manual methods
• Enables engineers to check that the instruments they have purchased for the plant are performing to their technical specifications and are not drifting out of tolerance regularly
• Supplier evaluation: the performance and quality of sensors from different manufacturers can be compared quickly and easily

When calibration frequency can be decreased (a simple decision rule is sketched after this list):
• If the instrument has performed to specification and the drift has been insignificant compared to its specified tolerance
• If the instrument is deemed to be non-critical or is in a low-priority location

When calibration frequency should be increased:
• If the sensor has drifted outside of its specified tolerances during a given time period
• If the sensor is located in a critical process or area of the plant and has drifted significantly compared to its specified tolerance over a given time period
• When the sensor is located in an area of the plant that has high economic importance
• Where costly production downtime may occur as a result of a ‘faulty’ sensor
• Where a false measurement from a sensor could lead to inferior-quality batches or a safety issue
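The decrease/increase criteria above can be condensed into a simple rule of thumb, sketched below. The thresholds and adjustment factors are illustrative assumptions, not values taken from any standard or product.

    def recommend_interval(current_days, drift_ratio, critical):
        # drift_ratio: worst observed drift between calibrations divided
        # by the instrument's tolerance (0.2 = 20 % of tolerance used).
        if drift_ratio >= 1.0:                  # drifted out of tolerance
            return current_days // 2
        if critical and drift_ratio > 0.5:      # critical and drifting
            return int(current_days * 0.75)
        if not critical and drift_ratio < 0.2:  # stable, low priority
            return int(current_days * 1.5)
        return current_days                     # otherwise leave unchanged

    print(recommend_interval(365, 0.15, critical=False))  # 547: extend
    print(recommend_interval(365, 1.20, critical=True))   # 182: halve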

ISO 9001:2008 quality management requirements

7.6 Control of monitoring and measuring devices

The organization shall determine the monitoring and measurement to be undertaken and the monitoring and measuring devices needed to provide evidence of conformity of product to determined requirements. The organization shall establish processes to ensure that monitoring and measurement can be carried out and are carried out in a manner that is consistent with the monitoring and measurement requirements.

Where necessary to ensure valid results, measuring equipment shall
a) be calibrated or verified at specified intervals, or prior to use, against measurement standards traceable to international or national measurement standards; where no such standards exist, the basis used for calibration or verification shall be recorded;
b) be adjusted or re-adjusted as necessary;
c) be identified to enable the calibration status to be determined;
d) be safeguarded from adjustments that would invalidate the measurement result;
e) be protected from damage and deterioration during handling, maintenance and storage.

In addition, the organization shall assess and record the validity of the previous measuring results when the equipment is found not to conform to requirements. The organization shall take appropriate action on the equipment and any product affected. Records of the results of calibration and verification shall be maintained (see 4.2.4).

When used in the monitoring and measurement of specified requirements, the ability of computer software to satisfy the intended application shall be confirmed. This shall be undertaken prior to initial use and reconfirmed as necessary.

How often should calibrators be calibrated?

As a general rule for Beamex’s documenting MC calibrators, starting with a one-year calibration period is recommended, because the calibrators have a one-year uncertainty specification. The calibration period can be changed later, once you begin to accumulate a stability history that can be compared against your uncertainty requirements. In any case, there are many issues to consider when deciding on a calibrator’s calibration period, or the calibration period for any type of measuring device. This article discusses some of the things to be considered when determining the calibration period and provides some general guidelines for making this decision. The guidelines that apply to a calibrator also apply to other measuring equipment in the traceability chain, and they can be used for process instrumentation as well.

An important aspect of maintaining a traceable calibration system is determining how often the calibration equipment should be recalibrated. International standards (such as ISO 9000, ISO 10012, ISO 17025, the FDA’s CFRs, GMP, etc.) require the use of documented calibration programs. This means that measuring equipment should be calibrated traceably at appropriate intervals and that the basis for the calibration intervals should be evaluated and documented. When determining an appropriate calibration period for any measuring equipment, there are several things to be considered. They are discussed below.

Uncertainty need

One of the first things to evaluate is the customer’s uncertainty need for their particular measurement device. In fact, the initial selection of the measurement device should also be based on this evaluation. The uncertainty need is one of the most important things to consider when determining the calibration period.

Stability history

When customers have evaluated their needs and purchased suitable measuring equipment, they should monitor the stability history of that equipment. The stability history is an important criterion when deciding on any changes to the calibration period. Comparing the stability history of measuring equipment against the specified limits and uncertainty needs provides a practical tool for evaluating the calibration period. Naturally, calibration management software with a history analysis option is a great help in making this type of analysis.

The cost of recalibration vs. the consequences of an out-of-tolerance situation

Optimizing between recalibration costs and the consequences of an out-of-tolerance situation is important. In critical applications, the costs of an out-of-tolerance situation can be extremely high (e.g. pharmaceutical applications), and therefore calibrating the equipment more often is safer. However, in some non-critical applications, where the out-of-tolerance consequences are not serious, calibration can be performed less frequently. Evaluating the consequences of an out-of-tolerance situation is therefore something to be considered, and the corrective actions for such a case should also be written into an operating procedure. Some measurements in a factory typically have more effect on product quality than others; these more critical measurements should be calibrated more often than the rest.

Initial calibration period

When you purchase calibration equipment with which you are not familiar, you still need to decide on an initial calibration period.

In this situation, abiding by the manufacturer’s recommendation is best. For more critical applications, using a shorter calibration period right from the beginning is recommended.

Other things to be considered

There are also other issues to consider when determining the calibration period, such as the workload of the equipment, the conditions in which the equipment will be used, the amount of transportation it is subjected to, and whether the equipment shows signs of damage. In some cases, cross-checking with other similar measuring equipment is also a feasible way of detecting the need for calibration; in some critical applications, cross-checking may be carried out before every measurement. Naturally, only appropriate, metrologically responsible personnel in the company may change a piece of calibration equipment’s calibration period.
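Returning to the uncertainty need discussed earlier: a quick sanity check is the ratio between the tolerance being verified and the calibrator’s own uncertainty. The sketch below uses the common 4:1 rule of thumb; both the guideline and the figures are illustrative, not requirements of any particular standard.

    def uncertainty_ratio(process_tolerance, calibrator_uncertainty):
        # Ratio of what the measurement must achieve to what the
        # reference standard can deliver.
        return process_tolerance / calibrator_uncertainty

    # Invented figures: a 0.2 bar tolerance checked with a calibrator
    # whose one-year uncertainty is 0.04 bar.
    ratio = uncertainty_ratio(0.2, 0.04)
    print(f"Test uncertainty ratio: {ratio:.0f}:1")  # 5:1 - adequate vs 4:1

If the calibrator’s accumulated stability history shows its uncertainty degrading, this ratio shrinks – a clear signal to shorten the calibrator’s own calibration period.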

SUMMARY

The main issues to consider when determining the calibration period for measuring equipment include at least the following:
• The uncertainty needs of the measurements to be made
• The stability history of the measuring equipment
• The equipment manufacturer’s recommendations
• The risk and consequences of an out-of-tolerance situation
• The criticality of the measurements

Paperless calibration improves quality and cuts costs

Paper is part of our everyday lives – whether in the workplace or at home. Take a minute to look around the room you are in and you’ll notice how many objects are made from paper: books, magazines, printer paper, perhaps even a poster on the wall. Global consumption of paper has grown 400% in the last 40 years. Today, almost 4 billion trees, or 35% of all the trees cut down across the world, are used by paper industries on every continent (source: www.ecology.com). So let’s not add to this already heavy burden on our forests and the environment.

As manufacturing companies, our consumption of paper is far higher than it needs to be, especially given that there are technologies, software and electronic devices readily available today which render the use of paper in the workplace unnecessary. Other than helping to save our planet by reducing the number of trees cut down each year, there are other significant business benefits in minimising the use of paper.

Take the calibration of plant instrumentation devices such as temperature sensors, weighing instruments and pressure transducers. Globally, amongst the process manufacturing industries, calibrating instruments is an enormous task that consumes vast amounts of paperwork. Far too many of these companies still use paper-based calibration systems, which means they are missing out on the benefits of moving towards a paperless calibration system.

Traditional paper-based calibration systems

Typically, a paper-based calibration system involves the use of hand-written documents. Whilst out in the field, a maintenance or service engineer will typically use a pen and paper to record instrument calibration results. On returning to the office, these notes are then tidied up and/or transferred to another paper document, after which they are archived as paper documents.

While using a manual, paper-based system requires little or no investment in new technology or IT systems, it is extremely labour-intensive and means that historical trend analysis of calibration results becomes very difficult. In addition, accessing calibration data quickly is not easy. Paper systems are time consuming, they soak up lots of company resources and manual (typing) errors are commonplace. Dual effort and the re-keying of calibration data into multiple databases become significant costs to the business.

Together, the companies that use paper-based calibration systems are generating hundreds of thousands – perhaps millions – of paper calibration certificates each year. However, by utilising the latest software-based calibration management systems from companies like Beamex, these organisations can significantly reduce their paper consumption, whilst also improving quality and workflow and making other significant cost savings for the business.

Practical benefits of using less paper

Aside from the financial benefits of moving towards a paperless calibration system, there are practical reasons why firms should go paperless. Often, in industrial environments, it is not practicable to store or carry lots of paperwork. After all, every square foot of the business has an associated cost. Furthermore, important paper records could potentially be lost or damaged in an accident or fire. So why would these companies generate and store separate paper copies of important records such as works orders, standard operating procedures (SOPs), blank calibration certificates and so on, when these records can all be combined into a single electronic record?

Improved workflow

With paper-based systems, paper records that need approval have to be routed to several individuals, which is time-consuming. With paperless systems, workflow improves dramatically.

There will be less waiting time, as those individuals who need to sign off records or calibration documents can share or access electronic records simultaneously from a central database. The cost and time associated with printing copies of paper documents is also eliminated, as is the cost of filing and storing those paper records.

Just as important, electronic records enable easier analysis of data, particularly calibration results. Historical trending becomes easier, faster and more reliable, which again has cost-reduction benefits for the business. Calibration intervals can be optimised. For example, instruments that are performing better than expected may well justify a reduction in their calibration frequency. When a plant is being audited, calibration software facilitates both the preparation and the audit itself. Locating records and verifying that the system works becomes effortless when compared to traditional paper-based record keeping.

Paperless calibration systems improve plant efficiencies because the entire calibration process is now streamlined and automated. Costly production downtime due to unforeseen instrument failures will also be reduced.

Data integrity

The integrity of paper-based calibration systems cannot be relied upon. Paper records may not always reflect the truth. For example, manual errors such as misreadings can occur, particularly when using weigh-scales or other instruments that are open to an individual’s own interpretation of the data. Sometimes users may inappropriately modify the results data due to work pressures or a lack of time or resources. Illegible handwritten notes are also a problem, especially if these paper records need to be typed or transcribed into a computer system or database. Transcription errors such as these can lead to all sorts of problems for a business, and identifying and rectifying the rogue data can take months.

Business benefits

For those more enlightened companies that use software-based calibration systems, the business benefits are significant. The whole calibration process – from initial recording of calibration data through to historical trend analysis – will take less time, whilst mistakes and manual errors will be virtually eliminated.

In turn, this means that operators, engineers and management will have more confidence in the data, particularly when it comes to plant audits. In addition, this greater confidence in calibration data leads to a better understanding and analysis of business performance and KPIs (particularly if the calibration software is integrated with other business IT systems such as a CMMS), leading to improved processes, increased efficiencies and reduced plant downtime.

Commissioning

At plant commissioning times, electronic records simplify the handover of plant and equipment. Handover using paper records is straightforward and of universal format, but electronic records are far easier to work with and can be re-used in different IT systems. Electronic data also provides an excellent foundation for ongoing plant operation and maintenance, without the need to collect all the plant data again.

How paperless should you go?

Of course, in reality, many companies are neither completely paperless nor reliant solely on paper-based systems – the process is often a hybrid of the two. A key part of paperless calibration is the capture of data at the point of work, often in difficult industrial environments that make the use of portable office computers impractical; manual entry of calibration results into unintelligent calibration forms on portable industrial computers is prone to eye-to-hand misreads and errors induced by repetitive strain. One way to overcome these error-prone data capture methods is to use portable documenting calibrators to measure whatever can be measured, and to provide intelligent, technician-friendly interfaces on industrial PDA- or tablet-based hardware when manual data entry cannot be avoided. The non-editable electronic data stored on high-performance multifunction calibrators can be uploaded to calibration management software for safe storage and asset management.

Companies can go even further than this and use electronic records for works orders, business management systems, data historians and control systems. In other words, the calibration data is shared with other business IT systems electronically, resulting in completely paperless, end-to-end workflows.
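One common way of keeping uploaded results effectively non-editable is to make every stored record tamper-evident. The sketch below illustrates the principle with a cryptographic hash; the record contents are invented, and the mechanism is a generic illustration rather than a description of any particular product.

    import hashlib
    import json

    record = {"tag": "PT-204", "date": "2012-01-10",
              "as_found": 0.32, "as_left": 0.04, "technician": "A.B."}

    # Fingerprint the record once, at upload time.
    canonical = json.dumps(record, sort_keys=True).encode()
    fingerprint = hashlib.sha256(canonical).hexdigest()

    # Later: recompute and compare. Any edit to the stored record
    # changes the fingerprint and is detected immediately.
    check = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    print("intact" if check == fingerprint else "record has been altered")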

Suitable hardware

Rather than relying on engineers in the field accurately keying calibration results into suitably robust laptops or PDAs, it is better to capture the data electronically using documenting calibrators that are specifically designed for this task.

Validation, training & education

Paperless systems also need validating in the user’s own environment. Here, Beamex provides comprehensive validation, education and training services for customers. Education and training for users is critical, as this will help companies to overcome the natural resistance to change amongst a workforce that may be used to dealing with traditional, paper-based systems.

Case study

Beamex is helping many organisations to implement paperless calibration management systems, including Pharmaceuticals, Chemicals, Power & Energy, and Oil, Gas & Petrochemicals companies. Amongst these customers is the UK firm Croda Chemicals Europe. Based in East Yorkshire near Goole, the Croda plant uses pressurised vessels to purify lanolin for healthcare and beauty products. Each vessel needs to be certified at least once every two years in order to demonstrate that the vessel is safe and structurally sound. This includes a functionality check on all of the pressure instrumentation, as well as the sensors that monitor the incoming chemical additives and the outgoing effluent.

Senior Instrument Technician David Wright recalls what it was like to perform all of those calibration operations with paper and pencil during the company’s regularly scheduled maintenance shutdowns: “It took us one week to perform the calibrations and a month to put together the necessary paperwork.” Today, Croda uses the CMX calibration management software system from Beamex, which coordinates data collection tasks and archives the results. “It’s faster, easier and more accurate than our old paper-based procedures,” says Wright. “It’s saving us around 80 man-hours per maintenance period and should pay for itself in less than three years.”

Intelligent commissioning

Calibration plays a vital role in process plant commissioning and in the installation of new instruments. This article explains process instrument commissioning and the benefits of calibration during the commissioning phase.

What is process instrument commissioning?

Successful commissioning of process instrumentation is an essential requirement for optimal plant performance. A plant, or any defined part of a plant, is ready for commissioning when it has achieved mechanical completion. Plant commissioning involves activities such as checking that plant construction is complete and complies with the documented design or with acceptable (authorized and recorded) design changes.

In general, commissioning activities are those associated with preparing or operating the plant, or any part of the plant, prior to the initial start-up, and they are frequently undertaken by the owner or a joint owner/contractor team. Commissioning may involve mock operations – commissioning activities conducted to allow operational testing of the equipment and operator training and familiarization. At the completion of commissioning, the plant will be fully ready for production operation.

Energizing power systems, operational testing of plant equipment, calibration of instrumentation, testing of the control systems and verification of the operation of all interlocks and other safety systems are also typical commissioning tasks. These activities are usually described as ‘cold commissioning’.

Pre-commissioning activities are those which have to be undertaken prior to operating equipment, such as adjustments and checks on machinery performed by the construction contractor prior to commissioning, and without which the installation cannot be said to be mechanically complete. Mechanical completion of a plant, or any part of a plant, occurs when it has been completed in accordance with the drawings and specifications, and the pre-commissioning activities have been completed to the extent that the owner approves the plant and can begin commissioning activities.

Commissioning requires a team of people with a background in plant design, plant operation and plant maintenance. Some companies employ specialized commissioning engineers. This can prove to be a worthwhile investment for large plants, because it allows for dedicated responsibility and focus in operations and significant improvements to schedules, and adverse incidents at the start-up phase can be avoided. An extra day taken for commissioning means the same to the plant owner as an extra day taken during design or construction; in fact, it may cost more, as the plant owner’s commitments in terms of product marketing and operational costs are likely to be higher.

Management, personnel and cost of commissioning

Since commissioning takes place toward the end of the project, there is a risk that the work may be under-resourced because the funds have been allotted to cover budget overruns. It is essential to comprehend the scope and length of commissioning activities, include them in the initial project plan and budget allocations, and ensure this commitment is maintained.

The cost of process instrument commissioning is typically affected by the following issues: learning about and familiarizing oneself with the field device, physically installing the field device, connecting to and identifying the field device, configuring the required parameters, and testing the configuration and the interface to other systems. Basically, these steps must be repeated for every field device that will be installed at the plant.

As there are many cost factors in the commissioning process, detailed planning of commissioning and plant handover is as essential an element of the overall project plan and schedule as any other grouping of activities. Each of the commissioning activities must be broken down into a number of manageable tasks, and a schedule needs to be established for each task, including benchmarks for monitoring purposes.

Sequence of activities leading to commissioning and acceptance of a plant: Construction → Pre-commissioning → Mechanical completion → Commissioning → Trial operation → Initial start-up → Examine product specification → Examine production performance → Acceptance of plant.

Successful commissioning of process instrumentation must be considered within the context of the overall commissioning program. Good planning, coordination, communication, documentation, teamwork and training are all essential. The commissioning team consists of a mixture of specialists and instrument and process engineers; the size of the team and the mix of specialists depend on the nature and scope of the system.

Calibration and the commissioning of field instrumentation

New process instrumentation is typically configured and calibrated by the manufacturer prior to installation. However, instruments are often recalibrated upon arrival at the site, especially if there has been obvious damage in transit or storage. There are also many other reasons why instruments should be calibrated during the commissioning phase before start-up.

Assuring transmitter quality

First of all, the fact that an instrument or transmitter is new does not automatically mean that it is within the required specifications. Calibrating a new instrument before installing or using it is a quality assurance task. You can check the overall quality of the instrument, see whether it is defective and ensure it has the correct, specified settings.

Reconfiguring a transmitter

The new, uninstalled instrument or transmitter may have the correct, specified settings. However, it is possible that the originally planned settings are no longer valid and need to be changed. By calibrating an instrument you can check its settings, and it is then possible to reconfigure the transmitter when the initial planned specifications have changed. Calibration is therefore a key element in the process of reconfiguring an uninstalled transmitter.

Monitoring the quality and stability of a transmitter

When calibration procedures are performed for an uninstalled instrument, the calibration also serves future purposes. By calibrating the transmitter before installation and on a regular basis thereafter, it is possible to monitor the stability of the transmitter.

Entering the necessary transmitter data into a calibration database

By calibrating an instrument before installation it is possible to enter all the necessary instrument data into the calibration database, as well as to monitor the instrument's stability, as explained in the previous paragraph. The calibration database can be calibration software designed specifically for managing calibration assets and information, such as the Beamex® CMX Calibration Software. The transmitter information is critical in defining the quality of the instrument and for planning its optimal calibration interval. Transmitters that are found to be highly stable need not be recalibrated as often as transmitters that tend to drift. The trick is determining which sensors should be recalibrated after a few hours, weeks or years of operation and which can be left as is for longer periods without sacrificing quality or safety; a simple sketch of such drift-based interval logic follows below.
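
To illustrate that reasoning, here is a minimal sketch of drift-based interval estimation. It is not a Beamex algorithm; the calibration history, the tolerance and the safety margin are all hypothetical, and extrapolating drift linearly from two calibrations is a deliberate simplification of what a real interval analysis would do.

```python
from datetime import date

# Hypothetical calibration history for one transmitter:
# (calibration date, observed error as % of span)
history = [
    (date(2011, 1, 10), 0.02),
    (date(2011, 7, 12), 0.05),
    (date(2012, 1, 9), 0.09),
]

tolerance = 0.25  # acceptance limit, % of span
margin = 2.0      # safety factor applied to the projection

# Estimate the drift rate from the first and last calibrations
days = (history[-1][0] - history[0][0]).days
drift_per_day = (history[-1][1] - history[0][1]) / days

# Project how long until the error reaches the tolerance, then apply the margin
days_to_limit = (tolerance - history[-1][1]) / drift_per_day
suggested_interval = days_to_limit / margin

print(f"Drift: {drift_per_day * 365:.3f} %/year, "
      f"suggested recalibration interval: {suggested_interval:.0f} days")
```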

Doing so allows maintenance personnel to concentrate their efforts only where needed, thereby eliminating unnecessary calibration work. Entering the instrument data into a calibration management system is therefore part of the calibration procedures performed on an instrument before it is installed and in use.

Integrated calibration solution by Beamex

The Beamex® Integrated Calibration Solution, consisting of calibration software and documenting calibration equipment, improves the quality and efficiency of the entire calibration system through faster, smarter and more accurate management of all calibration assets and procedures. The Beamex® MC series documenting calibrators can be used for calibrating pressure, temperature, electrical and frequency signals. The Beamex calibrators support various transmitter protocols, such as analog, HART, Foundation Fieldbus and Profibus. They are all-in-one calibrators, which means that they can replace several individual measurement devices. Intrinsically safe calibrators for potentially explosive environments are also available.

The Beamex® CMX Calibration Software can be used for improving the quality, productivity and cost-effectiveness of a plant's calibration process. It can be used for planning and scheduling calibrations, managing and storing all calibration data, and analyzing and optimizing calibration intervals. Using CMX always gives a clear status of the transmitters: for instance, whether they are installed and ready for calibration, whether someone is performing a calibration (check in/out function) and what the instrument/position status is (pass/fail).

Having a fully integrated calibration management system – using documenting calibrators and calibration management software – is important. Beamex® CMX Calibration Software ensures that calibration procedures are carried out at the correct time and that calibration tasks do not get forgotten, overlooked or become overdue. By using a documenting calibrator, the calibration results are stored automatically in the calibrator's memory during the calibration process. Engineers performing calibrations no longer have to write down any results on paper, making the entire process much quicker and reducing costs. All calibration documentation is therefore produced automatically when using the Beamex® Integrated Calibration Solution.

The quality and accuracy of calibration results also improve, as there are fewer mistakes due to human error. The calibration results are transferred automatically from the calibrator's memory to the computer/database. This means that engineers do not spend their time transferring results from a notepad to final storage on a computer; again, saving time and money. Major time savings can also be achieved by using the HART and/or fieldbus functionality of Beamex's documenting MC calibrators to read transmitter data into the calibrator's memory, from where it can be populated to the CMX Calibration Software, instead of typing the data manually into the calibration database.

SUMMARY
Calibration is beneficial during process plant commissioning for several reasons:
• Transmitter quality assurance
• Reconfiguring a transmitter
• Monitoring the quality and stability of a transmitter
• Entering the necessary transmitter data into a calibration database and defining the optimal calibration interval

Successfully executing a system integration project

For process manufacturers today, having a reliable, seamlessly integrated set of IT systems across the plant, or across multiple sites, is critical to business efficiency, profitability and growth. Maintaining plant assets – whether that includes production line equipment, boilers, furnaces, special purpose machines, conveyor systems or hydraulic pumps – is equally critical for these companies. Maintenance management has become an issue which deserves enterprise-wide and perhaps multi-site attention, especially if the company is part of an asset-intensive industry, where equipment and plant infrastructure is large, complex and expensive. If stoppages to production lines due to equipment breakdowns are costly, implementing the latest computerized maintenance management system (CMMS) might save precious time and money.

In the process industries, a small but critical part of a company's asset management strategy should be the calibration of process instrumentation. Manufacturing plants need to be sure that their instrumentation products – temperature sensors, pressure transducers, flow meters and the like – are performing and measuring to specified tolerances. If sensors drift out of their specification range, the consequences can be disastrous, perhaps resulting in costly production downtime, safety issues or batches of inferior-quality goods being produced, which then have to be scrapped. For this, Beamex's calibration management software, Beamex® CMX, has proved itself time and time again across many industry sectors, including pharmaceuticals, chemicals, nuclear, metal processing, paper, and oil and gas.

Seamless communication

Today, most process manufacturers use some sort of computerized maintenance management system (CMMS) that sits alongside their calibration management system. Beamex® CMX Professional or Beamex® CMX Enterprise software can easily be integrated with CMM systems, whether it is a Maximo, SAP or Datastream CMM system or even a company's own in-house software for maintenance management. Beamex® CMX helps companies document, schedule, plan, analyze and optimize their calibration work.

Seamless communication between CMX and 'smart' calibrators means that companies have the ability to automate predefined calibration procedures. As well as retrieving and storing calibration data, CMX can also download detailed instructions for operation before and after calibrating, such as procedures, reminders and safety-related information. Seamless communication with calibrators also provides many practical benefits, such as a reduction in paperwork, elimination of the human error associated with manual recording, and the ability to speed up the calibration task. CMX also stores the complete calibration history of process instruments and produces fully traceable calibration records.

Integrating CMX with a CMM system means that the plant hierarchy and all work orders for process instruments can be generated and maintained in the customer's CMM system. Calibration work orders can easily be transferred to CMX Calibration Software. Then, once the calibration work order has been executed, CMX sends an acknowledgement of this work back to the customer's CMM system. All detailed calibration results are stored and available in the CMX database. A simplified sketch of this work-order round trip follows below.
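
The sketch below illustrates the shape of that round trip: a work order generated in the maintenance system, executed in the calibration system, and acknowledged back. It is purely illustrative; the class, field and function names are invented for this example and do not represent the actual CMX or CMMS interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    order_id: str
    tag: str                     # instrument position, e.g. "TT-101"
    due_date: str
    status: str = "OPEN"
    results: list = field(default_factory=list)

def execute_calibration(order: WorkOrder, results: list) -> WorkOrder:
    """Calibration system stores the detailed results and closes the order."""
    order.results = results
    order.status = "DONE"
    return order

def acknowledge(order: WorkOrder) -> dict:
    """Acknowledgement message sent back to the maintenance (CMM) system."""
    return {"order_id": order.order_id, "status": order.status}

wo = WorkOrder("WO-1001", "TT-101", "2012-06-30")
wo = execute_calibration(wo, [{"point": "50 °C", "error_pct": 0.12, "verdict": "PASS"}])
print(acknowledge(wo))  # {'order_id': 'WO-1001', 'status': 'DONE'}
```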

Integration project

A customer may have a large CMM system and a considerable amount of data keying to perform before integration is complete. A data exchange module or interface that sits between the two systems is required. The integration project involves three main parties: Beamex, the customer and the CMM system software partner.
Project organization and resourcing

In order to have a successful integration, it is important that the right people and decision-makers are involved and participate right from the beginning of the project. It is also essential that the main roles and responsibilities of the parties are specified before the project evolves. Moreover, a project organization should be established that includes members from both the supplier's and the customer's organization, as a successful project requires input from both parties. The role of each member should be defined and project managers appointed. The project manager is usually responsible for the operative management of the project. In addition, a project steering group may need to be established. The project steering group is responsible for making key decisions during the project. The role, tasks and authority of the project steering group must be defined, as well as its decision-making procedures.

Project phases

The integration project is divided into four main phases:

1. Scope of Work
2. Development and Implementation
3. Testing
4. Installation, Verification and Training

The four main phases are also often divided into sub-phases. A schedule is usually defined for the completion of the entire project as well as for the completion of each project phase. Each project phase should be approved according to the acceptance procedures defined in the offer, the agreement, the project plan or another document annexed to the offer or agreement.

Scope of work

To ensure a successful integration with a satisfied customer, defining the correct scope of work (SOW) is crucial. The scope of work should include a brief project description, the services provided, the main roles, partner responsibilities and the desired outcome. The scope of work is important for making sure that both the supplier and the customer have understood the project in question and have similar expectations of it. The SOW is often developed through pre-studies and workshops.

Defining what is not included in the scope of work is just as important as defining what is included in it. This means that establishing a framework and limitations for the project is also very important, as the resourcing, scheduling and costs of the project depend greatly on the scope of work. If the scope of work is not defined carefully, questions or problems may appear later in the project, directing the project back to phase one, where a review of the scope is necessary. This is an urgent but time-consuming matter and can be avoided if the right people and decision-makers participate in the first project phase. However, as changes to the original scope of work may be necessary even in projects where the SOW phase has been done carefully, it is important that the supplier and customer agree on change management procedures as early as the starting phase of the project.

Development and implementation

When the scope of work has been defined and approved by both parties, the integration can enter the next phase: the actual development and implementation of the project deliverables.

Testing

Testing occurs both during the project, after each partial delivery, in order to be able to continue the development work to the next phase, and at the final stage of the project. The testing, approval procedures and timelines should be defined when agreeing on the project.

Installation, verification and training

The final stage in the integration process is the installation and testing at the customer's facility and taking the system into production use. The project manager at the buyer's facility now plays a major role in the success of the integration process. The supplier will, if required and agreed, assist with informing, training and providing training materials.

When the integration is finished, the customer has a system that saves time, reduces costs and increases productivity by preventing unnecessary double effort and re-keying of procedures in separate systems.

Integration project phases: Scope of work (SOW) → Development and implementation → Testing → Installation, verification and training → final approval by the customer → follow-up and closure of the integration project. Each phase produces its own documentation: specifications, implementation, testing and instructional documentation, respectively. Running through all phases are the purpose and needs, the target, the supplier's and the customer's responsibilities, project management and the project steering group, change management, and the testing and acceptance procedures.

When there is no need to manually re-key data, typing errors are eliminated. A CMMS integration also enables the customer company to automate its calibration management with smart calibrators. This improves the quality of the entire system.

Integrating a CMM system with calibration management software is an important step in the right direction when it comes to EAM, Enterprise Asset Management. However, EAM is more than just maintenance management software. It is about companies taking a business-wide view of all their plant equipment and coordinating maintenance activities and resources with other departments and sites, particularly with production teams. The savings from EAM are reasonably well documented and come in various guises, the most common benefits being: fewer equipment breakdowns (leading to a reduction in overall plant downtime); a corresponding increase in asset utilization or plant uptime; better management of spare parts and equipment stocks; more efficient use of maintenance staff; and optimized scheduling of maintenance tasks and resources. But the key to success is really the quality of the information you put into the software: the data has to be as close to 100% accurate as possible to get maximum benefit from the system.


Calibration in Industrial Applications

The benefits of using a documenting calibrator

For process manufacturers, regular calibration of instruments throughout a manufacturing plant is common practice. In plant areas where instrument accuracy is critical to product quality, safety or custody transfer, calibration every six months – or even more frequently – is not unusual. However, the key final step in any calibration process – documentation – is often neglected or overlooked because of a lack of resources, time constraints or the pressure of everyday activities. Indeed, many process plants are under pressure to calibrate instruments quickly but accurately and to ensure that the results are then documented for quality assurance purposes and to provide full traceability.

The purpose of calibration itself is to determine how accurate an instrument or sensor is. Although most instruments are very accurate these days, regulatory bodies often need to know just how inaccurate a particular instrument is and whether it drifts in and out of a specified tolerance over time.

What is a documenting calibrator?

A documenting calibrator is a handheld electronic device that is capable of calibrating many different process signals – such as pressure, temperature and electrical signals, including frequency and pulses – and then automatically documenting the calibration results by transferring them to fully integrated calibration management software. Some calibrators can read the HART, Foundation Fieldbus or Profibus output of transmitters and can even be used for configuring 'smart' sensors.

Heikki Laurila, Product Manager at Beamex in Finland, comments: "I would define a documenting calibrator as a device that has the dual functionality of being able to save and store calibration results in its memory, but which also integrates with and automatically transfers this information to some sort of calibration management software."

A non-documenting calibrator is a device that does not store data, or that stores calibration data but is not integrated with a calibration management system. Calibration results have to be keyed manually into a separate database, spreadsheet or paper filing system.

Why use a documenting calibrator?

By using a documenting calibrator, the calibration results are stored automatically in the calibrator's memory during the calibration process. The engineer does not have to write any results down on paper, which makes the entire process much faster and consequently reduces costs. The quality and accuracy of calibration results will also improve, as there will be fewer mistakes due to human error. The calibration results are automatically transferred from the calibrator's memory to the computer/database. This means the engineer does not have to spend time transferring the results from his notepad to final storage on a computer; again, saving time and money.

With instrument calibration, the calibration procedure itself is critical. Performing the calibration procedure in the same way each time is important for the consistency of results. With a documenting calibrator, the calibration procedure can be automatically transferred from the computer to the handheld calibrator before going out into the field. As Laurila states: "Engineers who are out in the field performing instrument calibrations receive instant pass or fail messages with a documenting calibrator. The tolerances and limits for a sensor, as well as detailed instructions on how to calibrate the transmitter, are entered once into the calibration management software and then downloaded to the calibrator. This means calibrations are carried out in the same way every time, because the calibrator tells the engineer which test point he needs to measure next. Also, having an easy-to-use documenting calibrator is definitely the way forward, especially if calibration is one of the many tasks that the user has to carry out in his daily maintenance routine."

With a multi-functioning documenting calibrator, such as the Beamex® MC5 or MC6, the user doesn't need to carry as much equipment while out in the field. Both calibrators can also be used to calibrate, configure and trim HART, Foundation Fieldbus H1 or Profibus PA transmitters. Laurila continues: "With a documenting calibrator, such as the MC5 or the MC6, the user can download calibration instructions for hundreds of different instruments into the device's memory before going out into the field. The corresponding calibration results for these instruments can be saved in the device without the user having to return to his PC in the office to download/upload data. This means the user can work in the field for several days."

Having a fully integrated calibration management system – using documenting calibrators and calibration management software – is important. Beamex® CMX Calibration Software ensures that calibration procedures are carried out at the correct time and that calibration tasks are not forgotten, overlooked or overdue.

Benefits in practice

Conventional calibration work relies on manual, paper-based systems for documenting. Manual calibration takes more time and is more prone to error. Often, the field engineer calibrates the instrument, handwrites the results onto a paper form and then re-enters this information into a database when he returns to the office. Unintentional errors often occur and the whole process is time-consuming. Using Beamex® CMX Calibration Software and the documenting Beamex® MC6 or MC5 Multifunction Calibrators provides full control of the entire calibration process and reduces costs by up to 50%.* Why? Because the devices provide higher accuracy, the calibration process is much faster, and the system provides full traceability.

When you have to calibrate instruments throughout a site, typically with five-point checks on each instrument, speed and accuracy are critical. Using the MC6 or MC5 with CMX software means that calibration instructions and calibration orders are downloaded to the calibrators, ready to guide the engineer in the field through the correct calibration procedures; a simple sketch of the pass/fail check applied at each test point follows below.

___________________ * Reported to the Industrial Instrumentation and Controls Technology Alliance and presented at the TAMU ISA Symposium, January, 2004
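
The instant pass/fail verdicts described above boil down to a simple comparison at each test point. The sketch below shows the idea for a hypothetical 0–100 °C transmitter with a 4–20 mA output and a ±0.5% of span tolerance; the numbers are invented, and the logic is a simplification of what a documenting calibrator does internally.

```python
SPAN_MA = 16.0        # output span: 20 mA - 4 mA
TOLERANCE_PCT = 0.5   # allowed error, % of span

def ideal_output_ma(temp_c, lo=0.0, hi=100.0):
    """Ideal 4-20 mA output of a linear 0-100 °C transmitter."""
    return 4.0 + SPAN_MA * (temp_c - lo) / (hi - lo)

# Five-point check: (applied reference temperature, measured output current)
test_points = [(0.0, 4.02), (25.0, 8.03), (50.0, 12.01), (75.0, 15.96), (100.0, 20.15)]

for temp, measured in test_points:
    error_pct = (measured - ideal_output_ma(temp)) / SPAN_MA * 100.0
    verdict = "PASS" if abs(error_pct) <= TOLERANCE_PCT else "FAIL"
    print(f"{temp:6.1f} °C  error {error_pct:+.3f} % of span  {verdict}")
```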

After completing the instrument calibrations, the system provides a full quality assurance report of all instruments calibrated, along with the required calibration certificates. This ensures full traceability and provides complete, traceable documentation of the completed work.

SUMMARY
The benefits of using a documenting calibrator:

• Calibration results are automatically stored in the calibrator's on-board memory during the calibration procedure.
• Calibration results are automatically transferred from the calibrator's memory to a computer or fully integrated calibration management system.
• Less paperwork and fewer manual errors.
• Reduced costs from a faster and more efficient calibration process.
• Improved accuracy, consistency and quality of calibration results.
• A fully traceable calibration system for the entire plant.
• The calibration procedure itself is guided by the calibrator, which uploads detailed instructions from the computer or calibration management software.
• No manual printing or reading of calibration instructions is required; again, saving time and money and simplifying the process.

Calibration of weighing instruments Part 1

From the point of view of the owner, weighing instruments, usually called scales or balances, should provide correct weighing results. How a weighing instrument is used, and how reliable its results need to be, can vary greatly. Weighing instruments used for legal purposes must undergo legal verification. If a weighing instrument is used in a quality system, the user must define its measurement capability. In any case, it is the owner or the user of the instrument who carries the final responsibility for measurement capability and who is also responsible for the processes involved. He or she must select the weighing instrument and the maintenance procedure to be used to reach the required measurement capability. From a regulatory point of view, the quality of a weighing instrument is already defined in OIML regulations, at least in Europe. Calibration is a means for the user to obtain evidence of the quality of weighing results, and the user must have the knowledge to apply the information obtained through calibration.

Calibration and legal verification

Weighing instruments may also possess special features. One of these is making measurements for which legal verification is required, for example when invoicing is based on the weight of a solid material. The requirements may vary slightly from country to country, but in the EU they are the same, at least at the stage when the weighing instrument is being introduced into use.

Verification and calibration follow different philosophies. Calibration reports the deviation between the indication and the reference (standard), together with its tolerance, whereas verification checks the indication against maximum permissible errors. This is a feasible practice for all weighing. The practical work for both methods is very similar, and both can be used to confirm measurement capability, as long as legal verification is not needed. The terminology and practices previously used for verifying measurement capability, and for weighing technology in general, are based on these practices of calibrating and verifying, even when general (non-legal) weighing was in question.

Confirmation is the collecting of information

Confirming the capability of weighing instruments should be done by evaluating the quality of the measuring device in the place where it will be used. In practice, this means investigating the performance of the weighing instrument; this operation is known as calibration (or verification). One calibration provides a snapshot in time, while a series of calibrations provides time-dependent information. The method of calibration should be selected so that it provides sufficient information for evaluating the required measuring tolerance. The method should also be consistent, so that the results of successive calibrations are comparable.

Comparing the indication of a weighing instrument with a known standard gives the deviation, or error. However, to be able to define the measuring tolerance, we need more information about the weighing instrument, such as repeatability, eccentric load behaviour, hysteresis, etc. We must remember that the quality of the evaluation of measuring tolerance depends on the information collected through calibration.

Using a calibration program which goes through the same steps for every calibration – calculating the deviation and measuring tolerance and, if necessary, producing a calibration certificate – is the best way to obtain reliable information for comparisons. Such a program is able to store the complete history of calibrated weighing instruments, as well as information for other measuring devices, and it is also handy for monitoring measuring systems. The most important aspect of a calibration program is that it allows the user to select the calibration method that corresponds to the required level of measuring tolerance, and it displays the history of calibrations, in this way providing the user with comprehensive information concerning measuring capability.

The purpose of calibration and complete confirmation

Calibration is a process through which the user is able to confirm the correct functioning of the weighing instrument on the basis of selected information. The user must define the limits for the permitted deviation from the true value and the required measuring tolerance. If these values are exceeded, adjustment or maintenance is necessary. Calibration itself, however, is a short-term process; the idea is that the weighing instrument remains in good working condition until the next calibration. For this reason, the user must determine all of the external factors which may influence the proper functioning of the weighing instrument. These factors may include the effect of the environment where the instrument is used, how often the instrument needs to be cleaned, and regular monitoring of the zero point and of the indication with a constant mass.

Today, the function of weighing instruments, as well as many other instruments, is based on microprocessors. These offer several possibilities for adjusting the parameters of measuring procedures. Calibration should be carried out using the parameter settings of normal use. It is very important that the users of the weighing instruments, as well as calibration personnel, are familiar with these parameters and use them consistently. Since there are several parameters in use, it is important to always have the instrument's manual easily available to the user.

The content of the calibration certificate

Very often the calibration certificate is simply filed as evidence of a performed calibration, to await the auditing of the quality system. However, a quality system is usually concerned with the traceability of measurements and the known measuring tolerance of the measurements made. The calibration certificate of a single measuring device is used as a tool for evaluating the measuring tolerance of the process and for demonstrating the traceability of the device in question. Performing calibrations based on the required measuring tolerance is better than routine measuring. Therefore, the user must evaluate the measuring tolerance of the process and compare this value with the required measuring tolerance.

SUMMARY
Calibration (or verification) is a fundamental tool for maintaining a measuring system. It also assists the user in obtaining the required quality of measurements in a process. The following must be taken into consideration:
• the type of procedure to be applied in confirming measuring tolerance
• the interpretation of the information in the calibration certificate
• changing procedures based on the information received
Quality calibration methods and data handling systems offer state-of-the-art possibilities to any company.

Calibration of weighing instruments Part 2

Weighing is a common form of measurement in commerce, industry and households. Weighing instruments are often highly accurate, but users, i.e. their customers and/or regulatory bodies, often need to know just how inaccurate a particular scale may be. Originally, this information was obtained by classifying and verifying the equipment for type approval. Subsequently, the equipment was tested or calibrated on a regular basis.

Typical calibration procedures

Calibrating scales involves several different procedures, depending on national and/or industry-specific guidelines or regulations, or on the potential consequences of erroneous weighing results. One clear and thorough guide is EA-10/18, Guidelines on the Calibration of Non-automatic Weighing Instruments, prepared by the European Co-operation for Accreditation and published by the European Collaboration in Measurement and Standards (EUROMET). Typical scale calibration involves weighing various standard weights in three separate tests:
• repeatability test
• eccentricity test
• weighing test (test for errors of indication)
In the pharmaceutical industry in the United States, tests for determining minimum weighing capability are also performed.

Repeated weighing measurements provide different indications

Usually, the object being weighed is placed on the load receptor and the weighing result is read only once. If you weigh the object repeatedly, you will notice slight, random variation in the indications. The repeatability test involves weighing an object several times to determine the repeatability of the scale; a short sketch of this calculation follows below.
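
A minimal sketch of the repeatability calculation: the standard deviation of repeated readings of the same load. The readings here are hypothetical.

```python
import statistics

# Ten hypothetical readings of the same nominal 10 kg load (kg)
readings = [10.0025, 10.0031, 10.0022, 10.0028, 10.0026,
            10.0033, 10.0024, 10.0029, 10.0027, 10.0030]

s = statistics.stdev(readings)  # sample standard deviation = repeatability
print(f"repeatability s = {s * 1000:.2f} g")  # about 0.34 g
```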

Center of gravity matters

The eccentricity test examines how much an eccentrically placed load affects the indication. Usually, the object being weighed is placed in the middle of the load receptor as accurately as possible, but this is sometimes difficult due to the shape or construction of the object. By weighing the same weight at the corners of the load receptor, you can determine how much the eccentricity of the load affects the indication of the scale, which is why typical calibration procedures include the eccentricity test.

Test for errors in indication

The weighing test examines the error of the scale's indication at several predefined loads. This enables you to correct the errors and to determine non-linearity and hysteresis. If the scale's maximum load is extremely large, it may be impractical to use standard weights for calibrating the entire range. In such a case, a suitable substitution mass is used instead. Substitution mass should also be used if the construction of the scale does not allow the use of standard weights.

A truck scale is unsuitable for weighing letters

The purpose of the minimum weight test is to determine the minimum weight which can be reliably and accurately measured using the scale in question. This condition is met if the measurement error is less than 0.1% of the weight, with a probability of 99.73%.
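
The minimum-weight condition above can be turned into a formula: requiring the error to stay below 0.1% of the load with 99.73% probability means three standard deviations must fit within 0.1% of the load, i.e. 3s ≤ 0.001·m, so m_min = 3000·s. A sketch, with a hypothetical repeatability value:

```python
# Minimum weight from the 0.1% / 99.73% condition: 3*s <= 0.001*m  =>  m_min = 3000*s
s = 0.00034          # kg, hypothetical repeatability (standard deviation) of the scale
m_min = 3 * s / 0.001
print(f"minimum weight = {m_min:.2f} kg")  # 1.02 kg for s = 0.34 g
```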

Combined standard uncertainty of the error u(E)

Knowing the error of the scale's indication at each point of calibration is not sufficient. You must also know how certain you can be about the error found at each calibration point. There are several sources of uncertainty of the error, e.g.:
• The masses of the weights are only known with a certain uncertainty.
• Air convection causes extra force on the load receptor.
• Air buoyancy around the weights varies according to barometric pressure, air temperature and humidity.
• A substitute load is used in calibrating the scale.
• Digital scale indications are rounded to the resolution in use.
• Analog scales have limited readability.
• There are random variations in the indications, as seen in the repeatability test.
• The weights are not placed exactly in the middle of the load receptor.

The uncertainty values determined at each point of calibration are expressed as standard uncertainties (coverage probability: 68.27%), which correspond to one standard deviation of a normally distributed variable. The combined standard uncertainty of the error at a certain calibration point therefore also has a coverage probability of 68.27%; the individual components are combined as the root sum of squares, as sketched below.
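
A minimal sketch of that root-sum-of-squares combination, assuming the components are independent and already expressed as standard uncertainties. The component values are hypothetical, chosen to land near the 0.7 g used in the example below:

```python
import math

# Hypothetical standard uncertainty components at one calibration point (grams)
components = {
    "reference weights": 0.30,
    "repeatability": 0.50,
    "rounding of digital indication": 0.29,  # d / (2*sqrt(3)) for resolution d = 1 g
    "air buoyancy and convection": 0.20,
}

u_E = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"u(E) = {u_E:.2f} g")  # about 0.68 g
```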

Distribution of the error around E = 2.5 g: ±u(E) spans 1.8–3.2 g (68.27%), U(E) = 2u(E) spans 1.1–3.9 g (95.45%), and U(E) = 3u(E) spans 0.4–4.6 g (99.73%).

Example: The calibration error and its uncertainty at the 10 kg calibration point may be expressed as, e.g., E = 2.5 g and u(E) = ±0.7 g, which means that the calculated error of the indication is 2.5 g and the actual error, with a coverage probability of 68.27%, is between 1.8 g and 3.2 g.
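
The coverage bands above follow directly from multiplying u(E) by a coverage factor k. A short sketch reproducing the numbers of the example, together with the 10 kg indication limits quoted in the next section:

```python
E, u = 2.5, 0.7  # error and standard uncertainty at the 10 kg point, grams

for k, coverage in [(1, "68.27%"), (2, "95.45%"), (3, "99.73%")]:
    print(f"k={k}: error between {E - k*u:.1f} g and {E + k*u:.1f} g ({coverage})")

# With k = 2, the indication for a true 10 kg load lies between:
lo = 10 + (E - 2 * u) / 1000
hi = 10 + (E + 2 * u) / 1000
print(f"indication between {lo:.4f} kg and {hi:.4f} kg (95.45%)")
```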

Expanded uncertainty in calibration U(E)

In practice, a coverage probability of 68.27% is insufficient. Normally, the coverage is extended to a level of 95.45% by multiplying the standard uncertainty by the coverage factor k = 2. If the distribution of the indicated error cannot be considered normal, or the reliability of the standard uncertainty value is insufficient, a larger k-factor should be used. If you are able to use the coverage factor k = 2, then the error and its expanded uncertainty at the calibration point above are E = 2.5 g and U(E) = ±1.4 g. This means that the calculated error of the indication is 2.5 g and the actual error, with a coverage probability of 95.45%, is between 1.1 g and 3.9 g.

Uncertainty of a weighing result

The purpose of calibration is to determine how accurate a weighing instrument is. As the case above indicates, you know that if you repeat the calibration several times, the indication when weighing a 10-kilogram object will be between 10.0011 kg and 10.0039 kg 95.45% of the time. However, the uncertainty of the results of later routine weighings is usually larger. Typical reasons for this are:
• Routine weighing measurements involve random loads, while calibration is made at certain calibration points.
• Routine weighing measurements are not repeated, whereas the indications obtained in calibration may be averages of repeated weighings.
• Finer resolution is often used in calibration.
• The loading/unloading cycles in calibration and routine weighing may be different.
• A load may be situated eccentrically in routine weighing.
• A tare balancing device may be used in routine weighing.
• The temperature, barometric pressure and relative humidity of the air may vary.
• The adjustment of the weighing instrument may have changed.
Standard and expanded uncertainties of weighing results are calculated using the technical data of the weighing instrument, its calibration results, knowledge of its typical behaviour and knowledge of the conditions at the location where the instrument is used. Defining the uncertainty of weighing results is highly recommended, at least once, for all typical applications, and always for critical applications.

Calculating the uncertainty of weighing results helps you decide whether or not the accuracy of the weighing instrument is sufficient and how often it should be calibrated. However, determining the uncertainty of weighing results is not part of calibration.

Calibrating and testing weighing instruments using CMX

CMX's scale calibration enables you to configure the calibration and testing of each weighing instrument individually. Correspondingly, copying configurations from one scale to another is easy. Error limits can be set according to OIML or Handbook 44, and wide variation in user-specific limits is also possible. CMX calculates the combined standard uncertainty and the expanded uncertainty of the weighing instrument's calibration, and it allows you to enter additional, user-defined uncertainty components alongside the supported ones. CMX's versatile calibration certificate, and the possibility to define a user-specific certificate, ensure that you can fulfil the requirements set for your calibration certificates.

CMX’s scale calibration enables you to uniquely configure calibration and test each weighing instrument.

141

142

calibrating temperature instruments

Calibrating temperature instruments

The most frequently measured variable in industry is temperature. Temperature greatly influences many physical properties of matter, and its influence on e.g. quality, energy consumption and environmental emissions is significant. Temperature is a state of equilibrium, which makes it different from other quantities: a temperature measurement involves several time constants, and it is crucial to wait until thermal equilibrium is reached before measuring. Metrology provides mathematical formulas for calculating uncertainty; the polynomials are specified in the ITS-90 tables (International Temperature Scale of 1990). For each measurement, a model that includes all influencing factors must be created. Every temperature measurement is different, which makes the temperature calibration process slow and expensive.

While standards determine the accuracy to which manufacturers must comply, they do not determine the permanency of that accuracy. Therefore, the user must verify the permanency of accuracy. If temperature is a significant measured variable from the point of view of the process, it is necessary to calibrate both the instrument and the temperature sensor. It is important to keep in mind an old saying: all meters, including sensors, show incorrectly; calibration will prove by how much.

Temperature sensors

Industrial temperature measurement is most commonly based on temperature sensors, which either convert temperature into resistance (Resistance Temperature Detectors, RTDs) or into a low voltage (thermocouples, T/C). RTDs are based on the fact that resistance changes with temperature. The Pt100 is a common RTD type made of platinum; its resistance at 0 °C (32 °F) is 100 Ω. A thermocouple consists of two different metal wires connected together. If the junctions (hot junction and cold junction) are at different temperatures, a small temperature-dependent voltage can be detected. This means that the thermocouple is not measuring a temperature as such, but the temperature difference between its junctions. The most common T/C type is the K-type (NiCr/NiAl). Despite their lower sensitivity (low Seebeck coefficient), the noble thermo-elements of the S-, R- and B-types (PtRh/Pt, PtRh/PtRh) are used especially at high temperatures for better accuracy and stability.
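
For reference, the resistance-temperature relationship of a standard Pt100 is defined by the Callendar–Van Dusen equation of IEC 60751. A minimal sketch using the standard coefficients:

```python
# Callendar-Van Dusen equation for a standard Pt100 (IEC 60751):
# R(t) = R0 * (1 + A*t + B*t^2 + C*(t - 100)*t^3), with C used only below 0 °C.

R0 = 100.0  # ohms at 0 °C, by definition of the Pt100
A, B, C = 3.9083e-3, -5.775e-7, -4.183e-12

def pt100_resistance(t_c: float) -> float:
    c = C if t_c < 0 else 0.0
    return R0 * (1 + A * t_c + B * t_c**2 + c * (t_c - 100.0) * t_c**3)

for t in (-40, 0, 100, 200):
    print(f"{t:>4} °C -> {pt100_resistance(t):7.2f} ohm")
# 0 °C gives exactly 100.00 ohm; 100 °C gives about 138.51 ohm
```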

Temperature transmitters

The raw signal from a temperature sensor is not well suited to transmission over the long distances found in a plant. Therefore, temperature transmitters were developed to convert the sensor signal into a format that is easier to transmit. Most commonly, the transmitter converts the signal from the temperature sensor into a standard current signal ranging between 4 and 20 mA. Nowadays, transmitters with a digital output signal, such as fieldbus transmitters, are also being adopted. While the transmitter converts the sensor signal, it also has an impact on the total accuracy, and it must therefore be calibrated on a regular basis. A temperature transmitter can be calibrated using a temperature calibrator.

Calibrating temperature instruments

To calibrate a temperature sensor, it must be inserted into a known temperature. Sensors are calibrated either in temperature dry blocks (industrial field use) or in liquid baths (laboratory use). The sensor to be calibrated is compared against a reference sensor, and the most important criterion in the calibration of temperature sensors is how well the sensors agree at the same temperature. The heat source may also have an internal temperature measurement that can be used as a reference, but to achieve better accuracy and reliability, an external reference temperature sensor is recommended.

The uncertainty of calibration is not the same as the accuracy of the device. Many factors influence the total uncertainty, the way the calibration is performed not being the least of them.

All heat sources show measurement errors due to their mechanical design and thermodynamic properties. These effects can be quantified to determine the heat source's contribution to the measurement uncertainty. The major sources of measurement uncertainty are axial homogeneity, radial homogeneity, the loading effect, stability and immersion depth. Guidelines for minimizing measurement uncertainty should be applied according to EURAMET cg-13 v.01 (formerly EA-10/13).

Measurement uncertainty

Axial homogeneity
Axial homogeneity is the temperature distribution in the measurement zone along the boring (the axial temperature distribution).

Radial homogeneity
Radial homogeneity is the difference in temperature occurring between the borings.

Loading effect
When several sensors are placed in the borings of the heat source, they will affect the accuracy. This phenomenon is called the loading effect.

Stability
Stability means the variation of the temperature in the measurement zone over time once the system has reached equilibrium. A period of thirty minutes is commonly used.

Immersion depth
To achieve a more stable calibration, the immersion depth of a probe should be sufficient for the sensor being calibrated. Stem conduction – heat flux along the length of the thermometer stem – affects both the reference sensor and the unit being tested.

The calibration of instruments and sensors must be performed periodically. An ISO quality system presupposes quality control of calibration: the calibration of instruments affecting production, regular calibration of sensors, traceable calibration and calibration documentation. The level of performance a calibration device needs to have depends on the accuracy requirements determined by each company. However, the calibration device must always be more accurate than the instrument or sensor being calibrated. The calibration of instruments and sensors can be carried out either on site or in a laboratory.

Integrated calibration solution – a smarter way to calibrate temperature

Beamex has introduced a smarter, more efficient and accurate solution for calibrating temperature. It is a complete solution for temperature calibration with various products and services, such as a series of high-quality dry blocks for field and laboratory use, smart reference probes and temperature calibration laboratory services. "The temperature products and services we are now introducing form an integral part of the Beamex® Integrated Calibration Solution, a complete calibration solution that enables faster, more accurate and efficient management of all calibration assets and procedures", says Raimo Ahola, CEO of Beamex Group.

The Beamex® Integrated Calibration Solution concept is the combination of calibrator, calibration software and PC for online calibration. The instrument to be calibrated is connected to the calibrator, which is controlled by a computer, and the computer controls the calibration event. The Beamex® FB and MB dry blocks are part of the Beamex® Integrated Calibration Solution. The dry blocks communicate with the Beamex documenting multifunction calibrators, enabling fully automated temperature calibration and documentation. The calibration results can then be uploaded from the documenting calibrators to the Beamex® CMX Calibration Software. The instrument's calibration information is saved, and History Trend reports are available both in numeric and in graphic form.

"This helps the client to follow the condition of the instrument, which is useful when making decisions about purchasing new instruments, planning service in advance and recalibration. With the CMX Software, you can print out a calibration report as well as a traceable, accredited calibration certificate. Our integrated calibration solution concept saves valuable time, eliminates any errors related to manual entry and assures repeatable calibration procedures", Mr Ahola adds.

Calculating total uncertainty of temperature calibration with a dry block

This article discusses the various uncertainty components related to temperature calibration using a temperature dry block, and presents how to calculate the total uncertainty of a calibration performed with a dry block.

What is a temperature dry block?

A temperature dry block consists of a heatable and/or coolable metallic block, a controller, an internal control sensor and an optional readout for an external reference sensor. This article focuses on models that use interchangeable metallic multi-hole inserts. There are fast and lightweight dry blocks for industrial field use as well as models that deliver near bath-level stability in laboratory use. There are also some work safety issues that favor dry blocks over liquid baths. For example, at temperatures above 200 °C liquids can produce undesirable fumes, or there may be fire safety issues; if a drop of water gets into hot silicone oil, it could even cause a small steam explosion which may splash hot oil on the user.

Dry blocks are almost without exception meant to be used dry. Heat transfer fluids or pastes are sometimes used around or inside the insert, but they do not necessarily improve performance. They may actually even impede the dry block's performance and damage its internal components.

EURAMET

The EURAMET guideline (EURAMET cg-13 v.01, July 2007 [previously EA-10/13]):
• The EURAMET calibration guide defines a normative way to calibrate dry blocks. As most manufacturers nowadays publish product specifications covering the main topics of the EURAMET guide, the products are easier to compare.

• Main topics in the EURAMET guideline include:
– Display accuracy
– Axial uniformity
– Radial uniformity
– Loading
– Stability over time
– Hysteresis
– Sufficient immersion (15 × diameter)
– Stem loss for probes of 6 mm diameter or greater
– Probe clearance (
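
The total-uncertainty calculation the article sets out to present typically follows this pattern: convert each specification limit into a standard uncertainty (for a rectangular distribution, divide the ± limit by √3), combine the components by root sum of squares, and multiply by k = 2 for an expanded uncertainty. A minimal sketch with hypothetical component values:

```python
import math

# Hypothetical ± limits (°C) for the EURAMET-style components listed above
limits = {
    "display accuracy": 0.10,
    "axial uniformity": 0.08,
    "radial uniformity": 0.02,
    "loading effect": 0.03,
    "stability over time": 0.02,
    "hysteresis": 0.02,
}

# Rectangular distribution: standard uncertainty = limit / sqrt(3)
u_components = [a / math.sqrt(3) for a in limits.values()]
u_total = math.sqrt(sum(u ** 2 for u in u_components))
U = 2 * u_total  # expanded uncertainty, k = 2 (about 95% coverage)

print(f"u = {u_total:.3f} °C, U (k=2) = {U:.3f} °C")
```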