Amgen Case Study


Overview of Amgen’s Commissioning and Qualification Program

Ronald Brunelle, Bob Buhlmann, Nick Haycocks

Agenda
•  Overview of Amgen Commissioning and Qualification Program – Ronald Brunelle
•  Workshop Warm-up: Review of Key Deliverables – Nick Haycocks
•  Computer Systems: Developing a Risk-Based Approach – Bob Buhlmann
•  Coffee break – 3:20

Overview of Amgen’s Commissioning and Qualification Program

Ronald Brunelle - Quality Assurance Director, Amgen

Agenda
•  Introducing Change
•  Highlights of the Program
•  Maturing the Program
•  Lessons Learned and Next Steps
•  Workshop Warm-up: Review of Key Deliverables

Why do this?

Goal
•  Simplify validation
•  Better, cheaper, faster

Industry guidance utilized
•  ASTM E2500
•  ICH

Introducing Change

From Impact Assessments to System Boundaries
•  C&Q impact assessments were time consuming
•  Impact assessments gave inconsistent results across sites
•  Component and system impact assessments were not helpful in the risk-based ASTM E2500 model

Boundary Diagrams
•  Boundary diagrams were created for all systems in Amgen
•  Allowed system and component assessments to be performed in a consistent manner
•  Eliminated the need to perform the C&Q impact assessment process
•  Set the stage for a risk-based approach to qualification

Change from Impact categorization to “Levels” – Mindset Change •  Level 1- Equipment assets within a system where operation or maintenance activities can affect the critical quality attributes, the critical aspects, or the critical process parameters of the product the system delivers.

•  Level 2 - Equipment assets within a system where operation or maintenance activities can have an adverse business impact.

•  Level 3 - Equipment assets not included in the definitions of level 1 or 2.
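The three definitions reduce to a precedence rule: quality impact wins over business impact, and everything else is Level 3. A minimal Python sketch of that rule follows; the names are invented for illustration and this is not Amgen's actual tooling.

```python
from enum import Enum

class Level(Enum):
    """System levels replacing the older impact categorization."""
    LEVEL_1 = 1  # can affect CQAs, critical aspects, or CPPs of the product
    LEVEL_2 = 2  # can have an adverse business impact
    LEVEL_3 = 3  # neither of the above

def classify_asset(affects_quality: bool, affects_business: bool) -> Level:
    """Classify an equipment asset within a system boundary.

    Quality impact takes precedence; assets with neither quality nor
    business impact fall through to Level 3 by definition.
    """
    if affects_quality:
        return Level.LEVEL_1
    if affects_business:
        return Level.LEVEL_2
    return Level.LEVEL_3

# A hypothetical pH probe that controls a critical process parameter:
print(classify_asset(affects_quality=True, affects_business=True))  # Level.LEVEL_1
```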

Highlights of The Program

The ASTM E2500 Standard Guide Provides a Framework for a New Approach to Commissioning and Qualification

•  A science- and risk-based approach to assure that GMP equipment and systems:
   •  Are fit for use
   •  Perform satisfactorily
•  Amgen interpretation and application of:
   •  ASTM E2500
   •  ICH Q9
   •  EU Vol. 4 Annex 15
   •  FDA Process Validation Guidance
   •  ISPE Baseline Guide Vol. 12: Verification (Draft)

Amgen Document Hierarchy plays a key role in change

Policy: Governing principle or position that mandates or restricts action.

Standard: Operational principles that are typically translated into other required/controlled documents (SOPs) providing how-to / execution information.

Procedure: Instructions for performing tasks, activities, or practices, including the identification of roles and responsibilities.

[Figure: document hierarchy, top to bottom] Quality Manual → GMP Quality Policy → GMP Operating Standards → Specifications → Procedures → Records

Using the Document Hierarchy
•  Commercial Operating Standard for Facility Design → GEP SOPs
•  Commercial Operating Standard for Validation of Level 1 Systems → Automation SOP, Quality Risk Assessment, Site Validation SOPs

Integration of Engineering and Quality can be achieved through the document hierarchy.

GEP Framework
•  Design review
•  Engineering change management
•  Quality requirements: concept design stage checks
•  Quality requirements: detail design stage checks
•  GEP review: concept and detail design stages
•  Automation project delivery

Commissioning Planning and Execution – sample deliverables
•  Sample summary reports
•  Sample commissioning plan / template
•  Sample commissioning specifications
•  Facility commissioning sample report
•  HVAC system commissioning sample report
•  Pipe system commissioning sample report
•  Process equipment and automation guidance
•  BMA/BAS system commissioning sample report
•  Security system commissioning

GEP Framework – Engineering Quality Systems
•  Engineering vendor assessment
•  Vendor data requirements
•  PDI / FAT testing
•  CMMS data
•  User requirements
•  Instrument calibration
•  Construction inspection
•  Walk down and punch listing
•  Receipt verification
•  Instrument check-out
•  Installation verification

Validation, Qualification, and Commissioning Processes

[Figure: lifecycle flow] User Requirements / Risk Assessment → Design Qualification → Installation Inspection and Commissioning → Development and Operational Testing → Process Validation → SIP / PQ → Implement for Use → Change Control → Decommissioning.

Brackets beneath the timeline map Commissioning to IQ and Qualification to OQ / IOQ, with Cleaning Validation, Performance Monitoring, and Validation spanning the later phases.

Qualification Process and Quality Oversight

[Figure: process flow] Requirements → Specifications and Design → Build / Construct → Verification → Accept and Release, with the flow as a whole constituting Qualification. Quality Oversight and Risk Assessment and Controls span the entire flow.

Verification: activities within any of these processes.

Quality oversight touch points: strategy, scope, and acceptance criteria for risk controls; risk assessment controls identified; risk assessment controls confirmed; approvals of program standards, GEPs, and SOPs.

Quality Oversight model at full maturity of the Engineering, Automation and Quality processes.

[Figure: deliverables positioned along the Plan → Design → Build → Test phases]
•  Plan: User Requirements; Validation Plan; Commissioning Plan; Quality Risk Assessments
•  Design: FAT; Design Reviews
•  Build / Test: Receipt Verification & Installation Verification; Commissioning Tests (automation checkout, performance tests, SAT); Qualification Summary Report; Performance Testing / PQ; Development Testing

Legend: markers in the original figure indicate, for each deliverable, whether it receives Quality Pre- and Post-Approval, Quality Review, or Quality Approval.

Quality Oversight model at early stages of maturity of the Engineering, Automation and Quality processes.

[Figure: the same Plan → Design → Build → Test phases and deliverables as the full-maturity model – User Requirements, Commissioning Plan, Quality Risk Assessments, Validation Plan, FAT, Design Reviews, Receipt Verification & Installation Verification, Commissioning Tests (automation checkout, performance tests, SAT), Qualification Summary Report, Performance Testing / PQ, Development Testing – with the Quality Pre- and Post-Approval, Quality Review, and Quality Approval markers distributed differently at this stage of maturity.]

Amgen Relational Model for Qualification

SMEs: Process, Product, System, Regulatory, Quality, Vendors

Core deliverables connected by the SMEs:
•  System Specifications
•  Vendor Assessment
•  User Requirements Traceability
•  Risk Assessment
•  Design Review
•  Commissioning / Testing

Supported by: vendor documents, Good Engineering Practices, change management

Qualification report → Performance Qualification → feed into ongoing maintenance management systems

Maturing the Program

Process Maturity

[Figure: maturity chart plotting the Qualification, Engineering, and Quality processes against overall maturity, converging toward integrated processes; "We are here" and "And here" markers indicate current positions.]

Early Adaptation
•  Some or few of the program elements incorporated
•  Quality pre- and post-approval of most qualification documents
•  Risk-based approach not fully developed or well understood
•  Use and overlap of existing processes (SOPs, etc.)
•  Validation and engineering integration not instituted

Transitional
•  Many but not all program elements incorporated
•  Quality pre- and post-approval of most qualification documents
•  Risk-based approach not fully utilized and consistent
•  Reduction and movement away from existing processes (SOPs, etc.)
•  Validation and engineering integration in progress

Mature
•  Most or all program elements properly implemented
•  Quality approvals focused on critical aspects as defined in the mature program
•  Risk-based approach fully utilized and consistent
•  Existing processes (SOPs, etc.) eliminated or completely aligned with the program
•  Validation and engineering integration fully realized

Assessing Maturity

Program items assessed against maturity stage (Early Adaptation, Transitional, Mature):
•  User requirements
•  Planning – validation and commissioning
•  Supplier assessment / use of supplier documentation
•  Automation requirements, specifications, and testing
•  Engineering standards & specifications
•  Risk assessment
•  Design review
•  Commissioning
•  Calibration
•  Confirmation of qualification
•  Quality oversight

Maturity - Commissioning ("We are here" / "And here" markers on the maturity scale)

Early Adaptation
•  No predefined commissioning strategy
•  Company documents developed independently of vendor documents
•  Limited use of SMEs

Transitional
•  Defined commissioning strategy
•  Partial integration of testing; multiple test documents
•  Some duplication of testing
•  Some use of SMEs

Mature
•  Defined integrated test strategy
•  Acceptance criteria for Quality and Engineering identified
•  Extensive reliance on, and accountability of, SMEs
•  Commissioning considered as part of qualification

Assessing Maturity – Program item: User requirements

Early Adaptation
•  New systems: URS not implemented or inconsistent; site SOP and/or template utilized; use of 'other docs' such as R/D; project user requirements only
•  Legacy systems: guidance in OS-000034; use of change control records; update existing R/D docs or other spec documentation

Transitional
•  New systems: URS implemented but inconsistent; site SOP and/or template utilized; use of 'other docs' such as R/D; GEP SOP referenced
•  Legacy systems: guidance in OS-000034; use of change control records; update existing R/D docs

Mature
•  New systems: URS consistently implemented; approved SOP and template; access to library of existing URSs
•  Legacy systems: guidance in OS-000034; GEP SOP enforced; approved SOP and template; access to library of existing URSs

Lessons Learned - Wins
•  One site inspection by an agency: the process was not challenged
•  Resulted in Quality focusing on critical aspects
•  Different groups understand and appreciate roles and interactions more clearly
•  Faster implementations and more robust systems have been recognized
•  System-based vs. component-based thinking

Lessons Learned - Challenges
•  Expectations around documentation practices
•  GEPs not well established and understood
•  Comfort zones are challenged and resistance varies
•  Use the escalation process for resistance


Maturing the Process
•  Engineering is strengthening its practices and delivery models to provide more robust systems
•  Alignment of procedures and practices between groups (Engineering, Automation, Validation)
•  Understanding and accountability of the SME role
•  Applicability to legacy systems
•  Intensified training for risk assessments
•  Individual site assessments to ensure maturity progression


Benefits Beyond Large Capital Projects

Project scope: Plan → Design → Build → Test → Operate & Maintain

Adopted practices: SMEs (Process, Product, System, Regulatory, Quality, Vendors)

Additional value:
•  User Requirements
•  Quality Risk Assessment

Additional benefits:
•  Change control assessments
•  Failure event impact assessments
•  Preventive maintenance
•  Others?

Thank You! Ronald Brunelle [email protected]


Workshop Warm-up Review of Key Deliverables

Nick Haycocks - Quality Assurance Senior Specialist

Review of Key Deliverables
•  User Requirements
•  Risk Assessment
•  Qualification Summary Report

User Requirements - Purpose

Serves as:
•  The guiding document for the engineering design, commissioning, and qualification process for a system
•  A focal point for documenting and communicating requirements

The document is maintained throughout the lifecycle of the system.

User Requirements Structure

Sections include:
•  Performance Requirements
•  Design Requirements
•  Operational Requirements
•  Automation Requirements
•  Miscellaneous Requirements

User Requirements Structure – Example Entries

Columns: ID No | Category and Parameter | Requirement | Type | Source

1.1 – Performance / Process – "The system must be capable of a chopper speed range of 1500 to 3000 rpm, ±10%." Type: Quality; Source: SPP. (This is a good requirement, for a granulator.)

1.2 – Performance / General – "The system must include a scale capable of measuring the 100 liter tank contents with an accuracy of ±0.1 kg." Type: Quality; Source: SPP. (This is an appropriate requirement.)

2.1 – Automation / General – "The control system must be operable in automatic or manual modes; manual will allow stepping through the control sequence for each recipe." Type: Business; Source: Quality / QRAES. (This requirement is inappropriate: standard functionality should not be specified within a URS. It should be defined in the detailed design documents.)

3 – Etc.
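Requirement rows like these lend themselves to a simple structured record for traceability. A minimal sketch in Python; the field layout is an illustrative assumption, not Amgen's URS template.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    """One row of a User Requirements table."""
    req_id: str    # hierarchical ID used for traceability, e.g. "1.1"
    category: str  # e.g. "Performance / Process"
    text: str      # the requirement statement itself
    req_type: str  # "Quality" or "Business" - drives verification rigor
    source: str    # where the requirement came from, e.g. "SPP"

urs = [
    Requirement("1.1", "Performance / Process",
                "Chopper speed range of 1500 to 3000 rpm, ±10%.",
                "Quality", "SPP"),
    Requirement("1.2", "Performance / General",
                "Scale measuring the 100 L tank contents to ±0.1 kg.",
                "Quality", "SPP"),
]

# The Quality-type requirements are the ones the risk assessment must trace to:
print([r.req_id for r in urs if r.req_type == "Quality"])  # ['1.1', '1.2']
```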

Quality Risk Assessment (Quality Risk Assessment for Equipment & Automated Systems, QRAES)
•  Risk assessment method for assessing and evaluating quality risks as they pertain to equipment and automated systems
•  Utilizes Severity, Likelihood, Detection (see the scoring sketch below)
•  Business risks and safety risks are not part of the Quality Risk Assessment
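As a rough sketch of how Severity, Likelihood, and Detection might combine into a risk level (the 1-to-5 scales and thresholds below are illustrative assumptions, not Amgen's actual QRAES scoring):

```python
def risk_level(severity: int, likelihood: int, detection: int) -> str:
    """Combine the three QRAES factors into a qualitative risk level.

    All inputs use an assumed 1 (low) to 5 (high) scale. For detection,
    a high score means the failure is HARD to detect, so it raises
    rather than lowers the resulting risk number.
    """
    for factor in (severity, likelihood, detection):
        if not 1 <= factor <= 5:
            raise ValueError("each factor must be rated 1-5")
    rpn = severity * likelihood * detection  # risk priority number, max 125
    if rpn >= 45:
        return "High"
    if rpn >= 15:
        return "Medium"
    return "Low"

# A severe, plausible hazard that routine checks would usually catch:
print(risk_level(severity=4, likelihood=3, detection=2))  # Medium
```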

Applicability of the Quality Risk Assessment
•  To identify the equipment design and performance elements that impact or control the system quality attributes
•  To provide a quality focus within the design, commissioning, and testing activities
•  To assess the impact of changes to legacy equipment and automated systems

Requisites
•  System Quality Attributes and System Process Parameters
•  Draft or final User Requirement Document or equivalent
•  Appropriate SMEs
•  Preliminary or final Design Specifications

Quality Risk Assessment
•  SQAs: a physical, chemical, biological, or microbiological property or characteristic of the system output that should be within an appropriate limit, range, or distribution
•  SQAs should always map to: Safety, Identity, Strength, Quality, Purity
•  SPPs: a process parameter whose variability has an impact on an SQA and therefore should be monitored or controlled to ensure the process produces the desired quality
•  SPPs should always map to the SQAs

Compressed Air Example – SQAs and SPPs (workshop warm-up)

System Quality Attributes (SQAs): ?? / ?? / ??
System Process Parameters (SPPs): ?? / ?? / ??

Answers:

System Quality Attributes (SQAs)
•  Particle count – viable and non-viable
•  Hydrocarbon content – particulate and vapor
•  Moisture content

System Process Parameters (SPPs)
•  Pressure?
•  Flow?
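A hypothetical illustration of how such SPPs and SQAs could be checked against alert limits. Every parameter name and limit value below is invented for the sketch; the real limits belong in the system's specifications.

```python
# Assumed alert limits for the compressed air example (illustrative only).
limits = {
    "pressure_barg": (6.0, 8.0),
    "dew_point_c": (float("-inf"), -40.0),  # moisture content as dew point
    "particles_per_m3": (0, 3520),          # non-viable particle count
}

def in_spec(readings: dict) -> dict:
    """Return a pass/fail flag for each monitored parameter."""
    return {name: lo <= readings[name] <= hi
            for name, (lo, hi) in limits.items() if name in readings}

print(in_spec({"pressure_barg": 7.1, "dew_point_c": -45.0,
               "particles_per_m3": 1200}))
# {'pressure_barg': True, 'dew_point_c': True, 'particles_per_m3': True}
```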

Quality Risk Assessment (QRAES) – Worked Extract (product contact materials example)

Column headings: Reference No. | SQA | Potential hazard to SQA | Effect / consequence of the hazard | Severity | Source or cause of the hazard (how can the hazard occur?) | Occurrence | Design features to control the likelihood of hazard occurrence (instrumentation, alarms, and interlocks) | Procedures required to control the likelihood of hazard occurrence (SOPs, training manuals) | Activities needed to verify the risk controls are in place and working as specified (development testing, IV, OV) | What mechanisms would detect the result if the risk occurred? | Detection | Risk Level | Comments | Impacted Systems

Example row:
•  SQA: Purity
•  Potential hazard to SQA: particulate contamination (metallic)
•  Effect / consequence: potential contamination of the system product
•  Source or cause: breakdown of the product contact materials of construction (metallic) due to the product, excipients, or cleaning materials
•  Design features: specification of appropriate materials
•  Procedures: the specification includes the requirement to clean and passivate the system prior to use; PMs will include the requirement to inspect the equipment during maintenance and report any degradation or corrosion found
•  Verification activities: compliance with the specifications for the product or ingredient contact materials will be confirmed through installation verification (IV); operational verification (OV) will confirm that the system was cleaned and passivated prior to use
•  Detection mechanisms: installation verification (IV) could not be completed if there are deviations from the specifications that cannot be reviewed and accepted through engineering change management; the report from maintenance of any damage, degradation, or corrosion found
•  Severity / Occurrence / Detection / Risk Level / Impacted Systems: N/A in this extract
•  Comments: consider rouge

Qualification Summary Report
•  Confirms that identified risk controls were tested and are working properly
•  Confirms that commissioning has been successfully completed
•  Provides traceability of testing activities to the QRAES, verification documentation, and requirements / specifications (see the sketch below)
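The traceability claim in such a report boils down to two checks: every risk control maps to at least one test, and every mapped test passed. A minimal sketch with invented control and test IDs:

```python
# Risk controls from the QRAES, mapped to the tests that verify them
# (IDs are invented for illustration).
controls = {
    "RC-01": ["IV-003"],            # material-of-construction check
    "RC-02": ["OV-012", "OV-013"],  # cleaning / passivation confirmation
    "RC-03": [],                    # nothing traced yet
}
executed = {"IV-003": "Pass", "OV-012": "Pass", "OV-013": "Pass"}

untraced = [c for c, tests in controls.items() if not tests]
failed = [c for c, tests in controls.items()
          if any(executed.get(t) != "Pass" for t in tests)]

# The summary report can only conclude when both lists are empty:
print("Untraced controls:", untraced)      # ['RC-03']
print("Failed or missing tests:", failed)  # []
```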

Example - Summary Report

TABLE OF CONTENTS
1. PURPOSE
2. SCOPE
3. SYSTEM DESCRIPTION
4. APPROACH
5. REFERENCES
6. RESPONSIBILITIES
   6.1 Operations
   6.2 Validation
   6.3 Quality
7. SYSTEM VERIFICATION SUMMARY
   7.1 System Design Documentation
   7.2 System Engineering Management Review
   7.3 Quality Risk Assessment for Validation Traceability Verification
8. DISCUSSION
9. CONCLUSION

Thank You! Nick Haycocks – [email protected]


Computer Systems – Developing a Risk Based Approach

Bob Buhlmann - Director, Corporate Quality Assurance, Amgen, Inc.

Agenda
•  GAMP
   •  Risk Management Process
   •  Functional Risk Assessment
•  Introduce Data Integrity Assessments
   •  Why
   •  What
   •  Do NOT
   •  When – SDLC
   •  Uses and Benefits
•  Summary

Data Integrity Assessment
•  The concept for delivering a risk-based approach to computer system validation
•  How to identify which data and records are important
•  Shifts Quality's focus toward the data of a system and away from functional testing of the system

GAMP – Risk Management Process
•  System GxP determination
•  What is the overall system impact?

Detailed assessment required?
•  If yes: identify functional risks; assess functional risks; identify required controls
•  Then, in either case: implement required controls; verify controls; review risk / monitor controls
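The decision flow fits in a few lines of Python. The rule that a detailed functional assessment is needed only for high-impact GxP systems is a simplifying assumption for this sketch, not a quotation of the GAMP text.

```python
def gamp_risk_process(is_gxp: bool, high_impact: bool) -> list[str]:
    """Sketch of the GAMP risk-management decision flow (simplified)."""
    steps = ["System GxP determination", "Assess overall system impact"]
    if is_gxp and high_impact:  # detailed assessment branch ("Y" in the flow)
        steps += ["Identify functional risks",
                  "Assess functional risks",
                  "Identify required controls"]
    steps += ["Implement required controls",
              "Verify controls",
              "Review risk / monitor controls"]
    return steps

for step in gamp_risk_process(is_gxp=True, high_impact=True):
    print("-", step)
```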

GAMP – Functional Risk Assessment

[Figure: Venn diagram of Product Quality, Data Integrity, and Patient Safety]

•  Should be used to identify and manage risks to patient safety, product quality, and data integrity that arise from failure of the function under consideration
•  Identify functions with impact on patient safety, product quality, and data integrity through the User Requirements Specifications and Functional Specifications

How do you provide for product quality and patient safety? Data integrity.

[Figure: Venn diagram of Data Integrity, Product Quality, and Patient Safety]

•  The data of computer systems contains the risks that relate to product quality and patient safety

Data Integrity Assessments – Why
•  The growing need for electronic data:
   •  Globalization
   •  Operational efficiency
   •  Paperless operations
•  Companies need to:
   •  Validate systems for "intended use"
   •  Validate systems to ensure data integrity
   •  Meet regulatory requirements (e.g., Part 11, SOX)
•  In preparation for FDA's Part 11 inspection initiative
•  Computer systems and deployments have grown in complexity

Data Integrity Assessments – Why (cont.)
•  Ensure data integrity and identify controls to prevent:
   •  Inaccurate, incomplete, or missing data
   •  Uncontrolled data modification (e.g., no recorded reason)
   •  Cross-outs of data
•  Evaluate high-impact computer systems for:
   •  Electronic records identification and documentation
   •  Data flow and audit trails
   •  Data backup / restore / archival
   •  Security and training
   •  Electronic signatures

Critical quality parameter = data (integrity)
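To make "no modification without a recorded reason" concrete, here is a toy audit-trail sketch. The record layout is invented, and real systems enforce this in the application or database layer rather than in caller code.

```python
import datetime

audit_trail = []

def modify_result(record: dict, field: str, new_value, user: str, reason: str):
    """Change a record value only with an attributed, reasoned, timestamped entry."""
    if not reason.strip():
        raise ValueError("a reason for the change must be recorded")
    audit_trail.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": user,
        "field": field,
        "old": record.get(field),
        "new": new_value,
        "reason": reason,
    })
    record[field] = new_value  # the change itself happens only after logging

sample = {"result": 7.2}
modify_result(sample, "result", 7.4, user="analyst1", reason="transcription error")
print(audit_trail[-1]["reason"])  # transcription error
```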


Data Integrity Assessments – What
•  Data systems with high impact on product quality (e.g., lab instruments, LIMS, CDS, MES)
•  Identify design elements that impact or control the system quality attributes (data) in the areas of:
   •  Data output
   •  Data change / entry
   •  Data security
   •  Data backup
•  Provide a 'quality focus' within the design, testing, operations, and periodic review activities
•  Provide a basis for assessing the impact of changes to systems

Data Integrity Assessments – Do NOT
•  Determine the need for, or level (rigor) of, testing
•  Identify the need to perform a supplier assessment
•  Identify project-related risks:
   •  Schedule impacts
   •  Resource constraints
   •  Financial risks
•  Safety-related risks (e.g., data center)

Data Integrity Assessments – SDLC

[Figure: lifecycle wheel] Plan → Design → Build/Configure → Test → Maintain, with the Data Integrity Assessment shown at the center of the lifecycle.

Data Integrity Assessments – Uses and Benefits
•  Determine/define the risks and controls for the system's electronic data:
   •  Defines critical data fields
   •  Identifies data that requires review
•  Ensure systems remain validated for intended use:
   •  Requirements / design specification completeness
   •  Supports the periodic review process
   •  Provides continual improvement
   •  Supports the change control process

SUMMARY
•  Reviewed GAMP and the need for data assessments
•  Data integrity provides assurance of product quality and patient safety
•  Foundation for conducting periodic reviews

Thank You! Bob Buhlmann


Technique for Data Integrity Assessment
•  Question: If reprocessing is performed, or if data is modified, describe how and when someone would detect that it was performed, and by what means it would be detected.

•  Response: Reprocessing of data in the system is called reactivation. Reactivation of the data is performed if a correction must be made to data (results) that are already approved; the reactivation process and the impact assessment of reactivating data entities in LIMS are controlled and governed by procedures.


Data Changes - Example

Columns: # | Action Required | What if (Risk) | Effect | Controls to Reduce and Detect Risk (Testing and Incident Management)

What if (Risk): invalid results are approved by the approver.

Effect: invalid results will be printed on the C of A.

Controls to reduce and detect risk:
•  Procedural control: the C of A review process will catch the error. The batch, sample, and test are each reactivated with a reason code; the test is rejected by the approver with a reason and the batch is reapproved, per SOP-XXXXXX.
•  The result updates are captured in the audit trail, which is reviewed in the lot review process per SOP-XXXXXX.
•  System control: out-of-range results are shown in purple font and crossed out in purple, providing visual recognition of out-of-range results.

Technique for Data Integrity Assessment
•  Question: Describe how (electronic/hardcopy) results are reviewed/approved and by whom. What data is being reviewed/approved (i.e., sample ID, lot number, particle size)?
•  Response: An authorized user logs samples and generates sample labels and a sample collection work list. Once the sample plans/jobs are logged, the following are printed and sent to the client for sample collection:
   •  Sample labels
   •  Mfg, EM, Water, and Stability collection work lists
Samples are collected, the chain of custody is initiated, and samples are received into the Sample Management area and distributed to the lab.

Data Output - Example

Columns: # | What if (Risk) | Effect | Controls to Reduce and Detect Risk | Action Required

What if (Risk): Sample Management sample labels are reprinted.

Effect (label control): multiple sets of labels exist for the same sample. Labels would be printed and applied to samples to be tested; if the results did not pass, the label could be reprinted on a second sample that would then be tested.

Controls to reduce and detect risk:
•  System control: once the chain of custody is initiated on a sample, custody cannot be re-initiated on the same sample, and the sample drops off the collection work list.
•  Samples are scanned in using a bar code, and no label can be scanned in more than once.
•  The event log captures how many labels have been printed for each label.
•  Procedural control.
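The "scan once" control above can be illustrated with a tiny sketch (the barcode format and messages are invented):

```python
# Toy version of the system control: once custody starts on a sample,
# the same barcode cannot be received a second time.
received = set()

def receive_sample(barcode: str) -> None:
    if barcode in received:
        raise RuntimeError(f"custody already initiated for {barcode}")
    received.add(barcode)
    print(f"{barcode} received; dropped from collection work list")

receive_sample("LOT42-S001")
try:
    receive_sample("LOT42-S001")  # a reprinted duplicate label
except RuntimeError as err:
    print("Blocked:", err)
```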

