ISA Certified Automation Professional (CAP) Job Analysis Study 2004

Notice

The information presented in this publication is for the general education of the reader. Because neither the author nor the publisher has any control over the use of the information by the reader, both the author and the publisher disclaim any and all liability of any kind arising out of such use. The reader is expected to exercise sound professional judgment in using any of the information presented in a particular application. Additionally, neither the author nor the publisher has investigated or considered the effect of any patents on the ability of the reader to use any of the information in a particular application. The reader is responsible for reviewing any possible patents that may affect any particular use of the information presented.

Any references to commercial products in the work are cited as examples only. Neither the author nor the publisher endorses any referenced commercial product. Any trademarks or tradenames referenced belong to the respective owner of the mark or name. Neither the author nor the publisher makes any representation regarding the availability of any referenced commercial product at any time. The manufacturer's instructions on use of any commercial product must be followed at all times, even if in conflict with the information in this publication.

Copyright © 2004 ISA–The Instrumentation, Systems and Automation Society
67 Alexander Drive
P.O. Box 12277
Research Triangle Park, NC 27709

All rights reserved.
ISBN: 1-55617-903-0

No part of this work may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior written permission of the publisher.

Library of Congress Cataloging-in-Publication Data in Progress

Table of Contents

Introduction
Phase I: Initial Development and Evaluation
Phase II: Validation Study
   I. Questionnaire Design, Sampling Plan, and Distribution
   II. Characteristics of the Sample
   III. Evaluation of Performance Domains
        A. Validation Scales
        B. Panelists’ Evaluations
        C. Respondents’ Evaluations
        D. Comparison of Panel Members’ and Respondents’ Evaluations
        E. Survey Respondent Subgroups’ Evaluations
   IV. Reliability Analysis for Domain Scales
   V. Delineation of Required Knowledge and Skills
   VI. Summary of Results
   VII. Conclusion
Phase III: Test Specifications
   Domain, Task, and Knowledge and Skill Statements
Appendix A: Contributors for the Practice Analysis Study
Appendix B: Other Responses
Appendix C: Major/Focus of Highest Degree
Appendix D: Job Analysis Survey

Introduction

ISA–The Instrumentation, Systems, and Automation Society works to protect the public by identifying individuals who are competent to practice in several related career fields. Consistent with this mission, the intended function of the ISA Certified Automation Professional (CAP) examination program is to assess the competence of the automation professional. Passing scores on the examination indicate that the Certified Automation Professional has achieved a level of ability consistent with requirements for competence on the job.

The development of a quality credentialing or licensing examination must follow certain logically sound and well-researched procedures. These principles and methods are outlined in federal regulation (Uniform Guidelines on Employee Selection Procedures) and manuals, such as Standards for Educational and Psychological Testing (published by the American Educational Research Association, 1999) and Standards for the Accreditation of Certification Programs (published by the National Commission for Certifying Agencies, 2002), as well as standards set by the American National Standards Institute (ANSI). Through its relationship with CASTLE Worldwide, Inc., ISA follows these standards in developing examinations for its credentialing program.

The guidelines hold that it is necessary to determine the knowledge and skills needed to be a competent practitioner in the field in order to develop a practice-related examination. The process for identifying these competency areas includes a job analysis study, which serves as a blueprint for examination development. A job analysis also helps to determine the type of examination, such as multiple-choice, to be developed in order to assess essential competence in the most appropriate manner.

The critical reason for conducting a job analysis study is to ensure that the examination has content validity. In psychometric terms, validation is the way a test developer documents that the competence to be inferred from a test score is actually measured by the examination. Content validity is the most commonly applied and accepted validation strategy used in establishing certification examinations. A content-valid examination for ISA’s Certified Automation Professional program, then, appropriately evaluates the knowledge and skill required to function as a competent practitioner in the automation profession. A content-valid examination in automation contains a representative sample of items that measure the knowledge and skills essential to the job.

The job analysis study is an integral part of ensuring that the examination is content-valid—that the aspects of automation covered on the examination reflect the tasks performed in the range of practice settings throughout the United States and Canada. For both broad content areas and tasks, the study validates importance and criticality to practice. These ratings play an important role in determining the content of the examination.

The ISA Certified Automation Professional practice analysis study consisted of the following three phases, which are the focus of this report:

I. Initial Development and Evaluation. In January 2004, a panel of 15 experts assembled by ISA met in Research Triangle Park, NC, with representatives from CASTLE Worldwide, Inc., to define the essential elements of the profession of automation. The panel identified the domains, tasks, knowledge, and skills consistent with this purpose.

II. Validation Study. A representative sample of 1,500 practicing automation professionals was asked to review and validate the work of the job analysis panel.

III. Development of Test Specifications. Based on the ratings gathered from the representative sample of automation professionals, the test specifications for the examination were developed.


PHASE I
INITIAL DEVELOPMENT AND EVALUATION

Since 1996, ISA has offered a well-recognized certification program for control systems technicians. Certified Control System Technicians (CCSTs) work in a variety of industries to monitor and calibrate devices that control the manufacturing process. In 2004, ISA began the first steps in the development of a new credentialing program for Certified Automation Professionals.

The first steps in analyzing the automation profession included the identification of the major content areas or domains, the listing of tasks performed under each domain, and the identification of the knowledge and skills associated with each task. To conduct the study, ISA assembled a 15-member panel of automation experts to discuss the practice. The panel members represented automation professionals practicing in various job settings, all geographic regions of the United States, and various experience levels, as well as educators. A complete list of panel members is provided in Appendix A.

The following steps were undertaken to complete Phase I:

A. The panel determined that the profession could be divided into six major domains of practice. The six domains of practice denote major responsibilities performed by automation professionals. These performance domains are:

   1. Feasibility Study
   2. Definition
   3. System Design
   4. Development
   5. Deployment
   6. Operation and Maintenance

B. Next, the panel delineated essential tasks in each of the six domains. The tasks define the domains and focus the automation professional on public safety, health, and welfare. The panel subsequently generated a list of knowledge and skills required to perform each task.

C. The panel members then evaluated each performance domain and task, rating each on importance and criticality to the automation practice.

Based on the work of the panel of experts, CASTLE developed an electronic survey and distributed it to a sample of automation professionals. The results of the survey are the focus of Phase II.


PHASE II
VALIDATION STUDY

I. Questionnaire Design, Sampling Plan, and Distribution

Using the domains and tasks identified by the panel of experts, CASTLE developed an electronic questionnaire to be completed by a sample of automation professionals. ISA provided CASTLE with a list of 1,500 names of professionals in the automation field. CASTLE distributed the questionnaire to these 1,500 professionals to consider, rate, and provide other feedback on the domain and task lists delineated by the panel of experts. The questionnaire also solicited biographical information from the respondents in order to ensure a representative response and completion by appropriately qualified individuals.

Of the 1,500 individuals who were asked to participate online, 219 submitted usable responses. Discounting undeliverable e-mail addresses, out-of-office individuals, individuals unable to log into the survey, and individuals opting out of the survey, the overall response rate was 14.95%. Given that the survey required approximately 20 minutes to complete and that it was unsolicited, the response rate achieved is reasonable. Not all individuals responded to every question; therefore, the total number of responses per question may vary.

II. Characteristics of the Sample

The characteristics of the sample are important as a means to assess the degree to which the group of respondents represents the automation profession along key dimensions. The panel of experts discussed key variables that might have an impact on how members of the profession view their work and developed 14 questions that accounted for them. Survey respondents were asked to provide this information by responding to the questions. The following tables summarize the information provided by survey respondents. Because some respondents elected not to answer every question, the frequencies reported below do not always total the number of respondents.


Table I. Gender

As shown in the table below, the majority of respondents (203, or 94.4%) are male.

GENDER      Frequency   Percent
Male              203      94.4
Female             12       5.6
TOTAL             215     100.0


Table II. Age

As shown in the table below, the majority of the sample was more than 40 years old. Thirteen individuals (6.0%) reported their age as under 30 years old.

AGE                  Frequency   Percent
Under 30 years              13       6.0
31-40 years                 79      36.7
41-50 years                 82      38.1
51-60 years                 34      15.8
61 years and above           7       3.3
TOTAL                      215     99.9*

*Due to rounding, percentage totals may not always equal 100.


Table III. Location

States and territories (including Alaska, Hawaii, and Puerto Rico) were grouped into five geographic regions. All regions were represented in the sample.

LOCATION    Frequency   Percent
Region 1           17       8.5
Region 2           50      25.0
Region 3           33      16.5
Region 4           53      26.5
Region 5           47      23.5
TOTAL             200     100.0


Table IV. Level of Experience

The table below presents the status of the respondents according to the years of experience they reported. As evidenced by the table, the respondents tended to be very experienced in the automation profession, with 97 individuals (45.1%) reporting more than 15 years of experience in the field.

YEARS OF EXPERIENCE                  Frequency   Percent
I’m not an automation professional           3       1.4
Less than 1 year                             3       1.4
1-5 years                                   19       8.8
6-10 years                                  47      21.9
11-15 years                                 46      21.4
More than 15 years                          97      45.1
TOTAL                                      215     100.0


Table V. Percentage of Time Spent Working as an Automation Professional in Current Position

The respondents were asked to provide the percentage of their time spent working as an automation professional in their current position. Nearly two-thirds of the respondents (65.6%) reported spending 76 to 100 percent of their time working as an automation professional in their current position.

PERCENT OF TIME SPENT                Frequency   Percent
I’m not an automation professional           4       1.9
Less than 25 percent                         6       2.8
25-50 percent                               27      12.6
51-75 percent                               37      17.2
76-100 percent                             141      65.6
TOTAL                                      215    100.1*

*Due to rounding, percentage totals may not always equal 100.


Table VI. Control Areas Worked in on a Daily Basis

The majority of the respondents reported working in both discrete (machine) control and process (liquid, dry) control areas on a daily basis.

CONTROL AREAS                Frequency   Percent
Discrete (Machine Control)          16       7.5
Process (Liquid, Dry)               47      22.0
Both Discrete and Process          151      70.6
TOTAL                              214    100.1*

*Due to rounding, percentage totals may not always equal 100.


Table VII. Primary Responsibility in Current Position

The majority of respondents (73.3%) reported that Project/Systems Engineering was their primary responsibility in their current position.

PRIMARY RESPONSIBILITY         Frequency   Percent
Field Engineering                      0       0.0
Information Systems                    5       2.5
Operations and Maintenance            24      11.9
Project/Systems Engineering          148      73.3
Other                                 25      12.4
TOTAL                                202    100.1*

*Due to rounding, percentage totals may not always equal 100.


Table VIII. Industry Worked In

Respondents were asked to select the response that best described the industry in which they worked. The responses are provided in the table below.

INDUSTRY                              Frequency   Percent
Aerospace                                     1       0.5
Automotive Manufacturing                      4       1.9
Building Automation                           6       2.8
Chemical Manufacturing                       25      11.7
Consumer Goods                                6       2.8
Electrical/Electronic Manufacturing          11       5.1
Engineering and Construction                 26      12.1
Environmental/Waste                           0       0.0
Food and Beverage Manufacturing              19       8.9
Machinery Manufacturing                      10       4.7
Metals Manufacturing                          3       1.4
Petroleum Manufacturing                      12       5.6
Pharmaceutical Manufacturing                 27      12.6
Plastics Manufacturing                        4       1.9
Pulp and Paper Manufacturing                  5       2.3
Textiles/Fabrics Manufacturing                0       0.0
Transportation                                2       0.9
Utilities                                    16       7.5
Water/Waste                                  15       7.0
Other                                        22      10.3
TOTAL                                       214     100.0


Table IX. Current Employer’s Company or Organization

The table below presents the status of the respondents according to their current employer’s company or organization. As shown below, the greatest number of respondents (82, or 38.1%) reported that their current employer is best described as an end-user. Only 13 individuals, or 6.0%, responded that their employer did not fit a listed category.

CURRENT EMPLOYER                         Frequency   Percent
Control Systems Suppliers                       15       7.0
End-Users                                       82      38.1
Engineering and Design Firm                     44      20.5
Original Equipment Manufacturer (OEM)           22      10.2
Systems Integrators                             39      18.1
Other                                           13       6.0
TOTAL                                          215     99.9*

*Due to rounding, percentage totals may not always equal 100.


Table X. Certifications/Licenses

Respondents were asked to indicate which, if any, certifications and licenses they held.

CERTIFICATIONS/LICENSES   Frequency
CEM                               1
CQE                               1
CCST                              2
CSE                              10
MSCE                              2
PE                               51
PMP                               3
Other                            22


Table XI. Professional Societies and/or Organizations

Respondents were also asked to indicate which, if any, professional societies or organizations they belonged to.

ORGANIZATION MEMBERSHIP   Frequency
AIChE                            13
ASME                              3
CSIA                             13
IBEW                             27
IEEE                             27
ISA                             124
UA                                1
Other                            32


Table XIII. Level of Education

The table below shows that a significant majority of respondents (62.6%) reported their highest level of education as a bachelor’s degree. Respondents were also asked to provide the major/focus of their highest degree. The responses are provided in Appendix C.

HIGHEST LEVEL OF EDUCATION      Frequency   Percent
High school/Secondary school           15       7.0
Associate Degree                       22      10.3
Bachelor’s Degree                     134      62.6
Master’s Degree                        36      16.8
Doctoral Degree                         3       1.4
Other                                   4       1.9
TOTAL                                 214     100.0


Table XIV. Annual Income

The responses for annual income are provided in the table below. Only three individuals (1.4%) reported earning an annual income of less than $20,000, while 28 individuals (13.4%) reported earning an annual income greater than $110,000.

ANNUAL INCOME        Frequency   Percent
Less than $20,000            3       1.4
$20,000 - $49,999           20       9.6
$50,000 - $79,999           83      39.7
$80,000 - $110,000          75      35.9
More than $110,000          28      13.4
TOTAL                      209     100.0


III. Evaluation of Performance Domains

A. Validation Scales. The panel of experts reviewed a number of scales that are often used in job analysis and other validation studies for the purpose of collecting data that would account for how members of the profession evaluate the domains and tasks. In making its selection, the panel considered which scales seemed most appropriate for the automation profession and the purpose of the study. After considerable discussion and rehearsal using the scales, the panel selected three: one for importance, one for criticality, and one for frequency. These scales were then used to collect preliminary validation data from members of the panel of experts and final validation data from survey respondents.

Participants (panel members and survey respondents) were asked to use four-point scales to express their evaluation of the importance and criticality of each performance domain and task, with a “4” representing the highest rating. The scale anchors for importance and criticality are listed below as a reference. The description for frequency is also provided below.

Importance

Participants were asked to rate each domain on a scale of importance, or the degree to which knowledge in the domain is essential to the minimally competent practice of automation. The rating anchors are provided below.

1. Slightly Important. Performance of tasks in this domain is only slightly essential to the job performance of the certified automation professional.
2. Moderately Important. Performance of tasks in this domain is only moderately essential to the job performance of the certified automation professional.
3. Very Important. Performance of tasks in this domain is clearly essential to the job performance of the certified automation professional.
4. Extremely Important. Performance of tasks in this domain is absolutely essential to the job performance of the certified automation professional.

Criticality

Participants were asked to rate each domain on a scale for criticality, or the degree to which adverse effects (of some type) could result if the certified automation professional is not knowledgeable in the domain. The rating anchors are provided below.

1. Minimal or No Harm. Inability to perform tasks within this performance domain would lead to error with minimal adverse consequences.
2. Moderate Harm. Inability to perform tasks within this domain would lead to error with moderate adverse consequences.
3. Substantial Harm. Inability to perform tasks within this domain would lead to error with substantial adverse consequences.
4. Extreme Harm. Inability to perform tasks within this domain would definitely lead to error with severe consequences.

Frequency

Participants were asked to provide the percent of time the certified automation professional spent performing the duties associated with each domain. Directions in the survey required respondents to ensure that the percentages given for the domains added to 100%.


B. Panelists’ Evaluations. The panelists’ ratings of the importance of the domains are provided below. The mean ratings ranged from 1.69 to 3.92 on the four-point scale.

IMPORTANCE
Domain                           Sample Size (N)   Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                          14   1.69                    .1929                 .722
II. Definition                                14   2.62                    .1972                 .738
III. System Design                            14   3.54                    .1993                 .746
IV. Development                               14   3.92                    .1267                 .474
V. Deployment                                 14   3.38                    .1972                 .738
VI. Operation and Maintenance                 14   2.54                    .2696                1.009
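The standard error column follows from the standard deviation and the sample size in the usual way; for example, for Domain I (Feasibility Study):

\[
SE = \frac{SD}{\sqrt{N}} = \frac{0.722}{\sqrt{14}} \approx 0.193
\]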

The panelists rated the criticality of the domains as seen in the table below. Domain V (Deployment) was the area seen as having the greatest potential for harmful results if the automation professional were not competent in the domain.

CRITICALITY
Domain                           Sample Size (N)   Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                          14   1.77                    .2380                 .890
II. Definition                                14   2.77                    .2993                1.120
III. System Design                            14   3.31                    .1929                 .722
IV. Development                               14   3.62                    .1670                 .625
V. Deployment                                 14   3.69                    .1619                 .606
VI. Operation and Maintenance                 14   2.46                    .3251                1.216

As shown in the table below, the panelists reported spending the least amount of time in Domain I (Feasibility Study) and the most time in Domain IV (Development).

FREQUENCY
Domain                           Sample Size (N)    Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                          15    6.21                    .7444                2.883
II. Definition                                15   12.86                   2.1971                8.509
III. System Design                            15   25.36                   3.9480               15.291
IV. Development                               15   26.43                   1.5758                6.103
V. Deployment                                 15   17.86                   1.8070                6.999
VI. Operation and Maintenance                 15   12.00                   2.1536                8.341

C. Survey Respondents’ Evaluations. Survey respondents employed the scales for importance, criticality, and frequency to evaluate all domains and tasks. Their responses are summarized in the tables below.

As depicted in the table that follows, survey respondents indicated that all domains are very important. Domain III (System Design) was seen as the most important of the six domains. Domain II (Definition) was considered the second-most important, followed closely by Domain V (Deployment). Domain VI (Operation and Maintenance) was considered to be the least important, although its mean rating was still above the scale mid-point.

IMPORTANCE
Domain                           Sample Size (N)   Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                         217   3.03                    .0540                 .796
II. Definition                               217   3.35                    .0470                 .692
III. System Design                           217   3.50                    .0424                 .625
IV. Development                              217   2.99                    .0586                 .863
V. Deployment                                217   3.12                    .0499                 .736
VI. Operation and Maintenance                217   2.58                    .0590                 .869


The respondents considered Domain III (System Design) to be the most critical of the six domains, followed closely by Domain V (Deployment). Domain I (Feasibility Study) was seen as the least critical, although its mean rating remains close to the scale mid-point.

CRITICALITY
Domain                           Sample Size (N)   Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                         217   2.43                    .0608                 .896
II. Definition                               217   2.79                    .0515                 .758
III. System Design                           217   3.32                    .0491                 .723
IV. Development                              217   3.04                    .0542                 .798
V. Deployment                                217   3.21                    .0498                 .734
VI. Operation and Maintenance                217   2.48                    .0606                 .893

The respondents rated Domain III (System Design) as being the most frequently performed, while Domain VI (Operation and Maintenance) was rated as being performed the least often.

FREQUENCY
Domain                           Sample Size (N)    Mean   Standard Error of Mean   Standard Deviation
I. Feasibility Study                         212   10.29                    .4965                7.229
II. Definition                               212   14.55                    .5179                7.540
III. System Design                           212   27.06                    .7265               10.578
IV. Development                              212   23.92                    .8668               12.621
V. Deployment                                212   14.37                    .5050                7.353
VI. Operation and Maintenance                212    9.82                    .6373                9.279


D. Comparison of Panel Members’ and Respondents’ Evaluations. The evaluations of domains by the panel of experts were compared to the ratings of survey respondents to determine if the results were similar. As depicted in the table that follows, both groups rated the importance of the domains similarly. Domain I (Feasibility Study) had the greatest difference in ratings.

IMPORTANCE
Domain                           Survey   Panel   Difference
I. Feasibility Study               3.03    1.69         1.34
II. Definition                     3.35    2.62         0.73
III. System Design                 3.50    3.54        -0.04
IV. Development                    2.99    3.92        -0.93
V. Deployment                      3.12    3.38        -0.26
VI. Operation and Maintenance      2.58    2.54         0.04

The two groups rated the criticality of the domains similarly, with Domain IV (Development) having the greatest difference (.58).

CRITICALITY
Domain                           Survey   Panel   Difference
I. Feasibility Study               2.43    1.77         0.66
II. Definition                     2.79    2.77         0.02
III. System Design                 3.32    3.31         0.01
IV. Development                    3.04    3.62        -0.58
V. Deployment                      3.21    3.69        -0.48
VI. Operation and Maintenance      2.48    2.46         0.02


The panelists and survey respondents also rated the frequency of the domains similarly. The greatest difference in the ratings was found in Domain I (Feasibility Study).

FREQUENCY
Domain                           Survey   Panel   Difference
I. Feasibility Study              10.29    6.21         4.08
II. Definition                    14.55   12.86         1.69
III. System Design                27.06   25.36         1.70
IV. Development                   23.92   26.43        -2.51
V. Deployment                     14.37   17.86        -3.49
VI. Operation and Maintenance      9.82   12.00        -2.18

E. Survey Respondent Subgroups’ Evaluations. When using a survey to collect information regarding a profession, it is to be expected that individuals in various settings may have differing views of the profession. Meaningful differences in domain or task ratings among the various subgroups might indicate that one should not generalize the survey results from one subgroup to another. With this in mind, the responses of specific subgroups were compared using the criterion that a difference of more than one unit on the four-point scales, or more than 10 points on the frequency scale, would indicate the possibility of a meaningful difference if any of the calculated values fell below the scale mid-point. Subgroups were defined by age, level of experience, time spent working as an automation professional in current position, control areas worked in on a daily basis, area of responsibility, employer, and highest level of education.

Although three between-group differences were slightly greater than ten points on the frequency scale, the importance and criticality means for the domain ratings were within one scale point for each comparison. Consequently, the mean responses of the various subgroups do not vary to a practical extent, indicating general agreement between and among the different subgroups of participants. The following tables illustrate the similarities in means, or averages, for the responses of subgroups of respondents. Only minor variations occur between the responses. The similarity in the ratings provides support for generalizing from the survey results to the general population of qualified automation professionals.
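As a rough illustration of this screening rule, the Python sketch below flags a domain when the spread of subgroup means exceeds the stated threshold. The values used are taken from the area-of-responsibility frequency table later in this section; the simplified rule and the helper function are illustrative only and are not the procedure CASTLE actually used.

```python
# Illustrative sketch of the subgroup screening rule described above (simplified):
# flag a domain when the spread of subgroup means exceeds one point on the 4-point
# scales or ten points on the frequency (percent-of-time) scale.

def meaningful_difference(subgroup_means: dict[str, float], threshold: float) -> bool:
    """Return True when the largest between-group gap exceeds the threshold."""
    values = list(subgroup_means.values())
    return (max(values) - min(values)) > threshold

# Frequency means for Domain IV (Development) by area of responsibility,
# taken from the subgroup tables later in this report.
development_frequency = {
    "Operations and Maintenance": 15.87,
    "Project/Systems Engineering": 26.58,
    "Other": 21.30,
}

print(meaningful_difference(development_frequency, threshold=10.0))  # True (26.58 - 15.87 = 10.71)
```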


AGE

IMPORTANCE
Domain                           Under 30 years   31-40 years   41-50 years   51-60 years   61 years and above
I. Feasibility Study                       2.75          3.01          3.04          3.18                   **
II. Definition                             2.75          3.33          3.41          3.47                   **
III. System Design                         3.75          3.54          3.39          3.53                   **
IV. Development                            3.25          3.04          3.04          2.88                   **
V. Deployment                              3.00          3.06          3.18          3.24                   **
VI. Operation and Maintenance              2.67          2.49          2.65          2.53                   **

CRITICALITY
Domain                           Under 30 years   31-40 years   41-50 years   51-60 years   61 years and above
I. Feasibility Study                       2.23          2.52          2.37          2.47                   **
II. Definition                             2.38          2.84          2.76          2.88                   **
III. System Design                         3.46          3.29          3.28          3.35                   **
IV. Development                            3.00          3.08          3.07          3.06                   **
V. Deployment                              3.15          3.17          3.29          3.18                   **
VI. Operation and Maintenance              2.69          2.40          2.57          2.50                   **

FREQUENCY
Domain                           Under 30 years   31-40 years   41-50 years   51-60 years   61 years and above
I. Feasibility Study                      11.15         10.27          9.94         10.88                   **
II. Definition                            11.54         14.14         14.77         15.39                   **
III. System Design                        22.69         26.03         26.12         29.39                   **
IV. Development                           25.38         26.80         22.78         22.42                   **
V. Deployment                             18.46         14.20         14.98         12.88                   **
VI. Operation and Maintenance             10.77          8.57         11.44          9.03                   **

**Sample size is insufficient to support conclusions.


LEVEL OF EXPERIENCE

IMPORTANCE
Domain                           Not an AP   Less than 1 year   1-5 years   6-10 years   11-15 years   More than 15 years
I. Feasibility Study                    **                 **        3.21         2.89          3.00                 3.10
II. Definition                          **                 **        3.16         3.20          3.38                 3.45
III. System Design                      **                 **        3.58         3.54          3.51                 3.48
IV. Development                         **                 **        2.84         3.13          3.13                 2.88
V. Deployment                           **                 **        3.00         3.15          3.22                 3.10
VI. Operation and Maintenance           **                 **        2.63         2.54          2.73                 2.49

CRITICALITY
Domain                           Not an AP   Less than 1 year   1-5 years   6-10 years   11-15 years   More than 15 years
I. Feasibility Study                    **                 **        2.63         2.22          2.50                 2.44
II. Definition                          **                 **        2.53         2.57          2.93                 2.84
III. System Design                      **                 **        3.26         3.20          3.24                 3.40
IV. Development                         **                 **        2.89         3.07          3.02                 3.06
V. Deployment                           **                 **        3.05         3.28          3.20                 3.24
VI. Operation and Maintenance           **                 **        2.79         2.54          2.46                 2.39

FREQUENCY
Domain                           Not an AP   Less than 1 year   1-5 years   6-10 years   11-15 years   More than 15 years
I. Feasibility Study                    **                 **       13.06         7.64         11.49                10.22
II. Definition                          **                 **       11.28        12.82         15.20                15.70
III. System Design                      **                 **       26.67        26.05         25.80                27.92
IV. Development                         **                 **       19.83        28.11         24.60                22.77
V. Deployment                           **                 **       18.56        15.07         14.13                13.55
VI. Operation and Maintenance           **                 **       10.61        10.32          8.82                 9.84

**Sample size is insufficient to support conclusions.


TIME SPENT

IMPORTANCE
Domain                           Not an AP   Less than 25 percent   25-50 percent   51-75 percent   76-100 percent
I. Feasibility Study                    **                     **            3.15            2.95             3.04
II. Definition                          **                     **            3.33            3.43             3.32
III. System Design                      **                     **            3.48            3.65             3.47
IV. Development                         **                     **            2.78            2.86             3.07
V. Deployment                           **                     **            3.26            2.89             3.16
VI. Operation and Maintenance           **                     **            2.78            2.57             2.54

CRITICALITY
Domain                           Not an AP   Less than 25 percent   25-50 percent   51-75 percent   76-100 percent
I. Feasibility Study                    **                     **            2.56            2.35             2.42
II. Definition                          **                     **            2.89            2.62             2.78
III. System Design                      **                     **            3.11            3.27             3.35
IV. Development                         **                     **            2.89            2.92             3.12
V. Deployment                           **                     **            3.19            3.19             3.24
VI. Operation and Maintenance           **                     **            2.59            2.54             2.46

FREQUENCY
Domain                           Not an AP   Less than 25 percent   25-50 percent   51-75 percent   76-100 percent
I. Feasibility Study                    **                     **           13.20           10.61             9.58
II. Definition                          **                     **           16.00           13.03            14.61
III. System Design                      **                     **           29.80           28.28            25.68
IV. Development                         **                     **           19.20           25.47            24.93
V. Deployment                           **                     **           14.12           12.78            15.01
VI. Operation and Maintenance           **                     **            7.68            9.83            10.20

**Sample size is insufficient to support conclusions.


CONTROL AREA(S)

IMPORTANCE
Domain                           Discrete   Process   Both
I. Feasibility Study                 2.63      3.00    3.10
II. Definition                       3.31      3.24    3.39
III. System Design                   3.56      3.48    3.49
IV. Development                      3.44      2.93    2.98
V. Deployment                        3.00      3.13    3.13
VI. Operation and Maintenance        2.38      2.46    2.64

CRITICALITY
Domain                           Discrete   Process   Both
I. Feasibility Study                 2.13      2.40    2.48
II. Definition                       2.81      2.66    2.83
III. System Design                   3.44      3.23    3.32
IV. Development                      3.13      3.00    3.06
V. Deployment                        2.94      3.26    3.24
VI. Operation and Maintenance        1.88      2.62    2.52

FREQUENCY
Domain                           Automation Engineer   Controls Engineer   Other
I. Feasibility Study                            7.63               11.28   10.24
II. Definition                                 16.56               15.11   14.21
III. System Design                             28.75               25.30   27.16
IV. Development                                27.38               22.52   24.19
V. Deployment                                  12.94               15.15   14.32
VI. Operation and Maintenance                   6.75               10.63    9.90


AREA OF RESPONSIBILITY

IMPORTANCE
Domain                           Field Engineering   Information Systems   Operations and Maintenance   Project/Systems Engineering   Other
I. Feasibility Study                            **                    **                          2.96                          3.01    3.16
II. Definition                                  **                    **                          3.33                          3.36    3.44
III. System Design                              **                    **                          3.38                          3.53    3.40
IV. Development                                 **                    **                          2.67                          3.09    2.72
V. Deployment                                   **                    **                          2.96                          3.16    2.88
VI. Operation and Maintenance                   **                    **                          2.83                          2.51    2.24

CRITICALITY
Domain                           Field Engineering   Information Systems   Operations and Maintenance   Project/Systems Engineering   Other
I. Feasibility Study                            **                    **                          2.18                          2.33    2.72
II. Definition                                  **                    **                          2.68                          2.75    2.84
III. System Design                              **                    **                          2.86                          3.32    3.56
IV. Development                                 **                    **                          2.64                          3.11    3.08
V. Deployment                                   **                    **                          3.05                          3.23    3.12
VI. Operation and Maintenance                   **                    **                          2.82                          2.37    2.44

FREQUENCY
Domain                             Field Engineering   Information Systems   Operations and Maintenance   Project/Systems Engineering   Other
I. Feasibility Study                              **                    **                          9.65                          9.48   13.04
II. Definition                                    **                    **                         13.17                         13.96   17.83
III. System Design                                **                    **                         25.43                         27.29   28.04
IV. Development*                                  **                    **                         15.87                         26.58   21.30
V. Deployment                                     **                    **                         15.96                         14.27   12.39
VI. Operation and Maintenance*                    **                    **                         19.91                          8.44    7.39

*Differences greater than 10 percentage points exist.
**Sample size is insufficient to support conclusions.


EMPLOYER

IMPORTANCE
Domain                           Control Systems Suppliers   End-Users   Engineering and Design Firm    OEM   Systems Integrators   Other
I. Feasibility Study                                  3.20        2.98                          3.07   2.82                  3.13    3.23
II. Definition                                        3.53        3.38                          3.20   3.36                  3.36    3.46
III. System Design                                    3.47        3.45                          3.59   3.41                  3.59    3.38
IV. Development                                       2.67        2.90                          3.07   2.95                  3.33    2.92
V. Deployment                                         2.80        3.10                          3.20   2.95                  3.26    3.31
VI. Operation and Maintenance                         2.67        2.75                          2.27   2.36                  2.49    3.08

CRITICALITY
Domain                           Control Systems Suppliers   End-Users   Engineering and Design Firm    OEM   Systems Integrators   Other
I. Feasibility Study                                  2.93        2.21                          2.59   2.59                  2.41    2.46
II. Definition                                        3.00        2.68                          2.84   2.86                  2.87    2.69
III. System Design                                    3.40        3.23                          3.32   3.36                  3.46    3.15
IV. Development                                       2.67        3.01                          3.16   3.09                  3.18    2.92
V. Deployment                                         2.93        3.31                          3.34   2.82                  3.26    3.15
VI. Operation and Maintenance                         2.13        2.75                          2.41   2.09                  2.28    2.77

FREQUENCY
Domain                           Control Systems Suppliers   End-Users   Engineering and Design Firm     OEM   Systems Integrators   Other
I. Feasibility Study                                 16.00       10.49                          8.88    9.64                  9.47   10.17
II. Definition                                       19.00       14.46                         13.72   15.59                 12.56   16.92
III. System Design                                   25.67       25.20                         30.12   29.23                 26.89   23.75
IV. Development*                                     18.33       22.11                         23.84   24.32                 32.42   20.00
V. Deployment                                        13.33       14.88                         15.02   14.91                 12.56   15.00
VI. Operation and Maintenance                         7.67       12.86                          8.47    6.32                  6.11   14.17

*Differences greater than 10 percentage points exist.


HIGHEST LEVEL OF EDUCATION

IMPORTANCE
Domain                           High school   Associate Degree   Bachelor’s degree   Master’s degree   Doctoral degree   Other
I. Feasibility Study                    3.21               2.86                2.95              3.33                **      **
II. Definition                          3.21               3.41                3.32              3.44                **      **
III. System Design                      3.29               3.55                3.53              3.44                **      **
IV. Development                         2.79               3.00                3.10              2.72                **      **
V. Deployment                           2.79               3.27                3.14              3.00                **      **
VI. Operation and Maintenance           2.57               2.64                2.57              2.50                **      **

CRITICALITY
Domain                           High school   Associate Degree   Bachelor’s degree   Master’s degree   Doctoral degree   Other
I. Feasibility Study                    2.13               2.32                2.41              2.67                **      **
II. Definition                          2.60               2.86                2.80              2.81                **      **
III. System Design                      3.07               3.41                3.33              3.28                **      **
IV. Development                         2.60               2.95                3.14              2.92                **      **
V. Deployment                           2.93               3.14                3.30              2.97                **      **
VI. Operation and Maintenance           2.33               2.18                2.48              2.61                **      **

FREQUENCY
Domain                           High school   Associate Degree   Bachelor’s degree   Master’s degree   Doctoral degree   Other
I. Feasibility Study                   10.00               9.36                9.48             12.77                **      **
II. Definition                         14.67              14.59               13.79             16.43                **      **
III. System Design                     32.33              30.59               25.34             29.43                **      **
IV. Development                        20.67              20.91               26.69             19.09                **      **
V. Deployment                          13.67              16.59               14.33             12.86                **      **
VI. Operation and Maintenance           8.67               7.95               10.38              9.43                **      **

**Sample size is insufficient to support conclusions.


IV. Reliability Analysis for Domain Scales

CASTLE assessed the reliability of the scales in order to determine how consistently the tasks measured the domains of interest. Reliability refers to the degree to which tests or surveys are free from measurement error. It is important to understand the consistency of the data along the importance and criticality dimensions in order to draw defensible conclusions. With inconsistency (i.e., unreliability), it would be impossible to reach accurate conclusions.

Reliability was estimated as internal consistency (Cronbach’s Alpha) using the respondents’ ratings of importance and criticality for each domain. This statistic reflects the extent to which each task rating within a domain consistently measures what the other task ratings within that domain measure. Reliability coefficients range from 0 to 1 and should be above 0.7 to be judged adequate. Reliability values below 0.7 indicate an unacceptable amount of measurement error. As shown below, all domains easily exceed this critical value.

RELIABILITY
Domain                           Importance   Criticality   Frequency
I. Feasibility Study                  .8464         .8653       .8870
II. Definition                        .8234         .8437       .8832
III. System Design                    .9014         .8981       .9251
IV. Development                       .9169         .8954       .9352
V. Deployment                         .9228         .9210       .9639
VI. Operation and Maintenance         .9334         .9259       .9104
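For readers unfamiliar with the statistic, the short Python sketch below shows one common way to compute Cronbach's alpha from a respondents-by-items matrix of ratings. The ratings matrix used here is invented for illustration and is not data from the study, and the report does not specify the exact software or formula variant CASTLE used.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of ratings.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = ratings.shape[1]                              # number of items (task ratings in a domain)
    item_variances = ratings.var(axis=0, ddof=1)      # variance of each item across respondents
    total_variance = ratings.sum(axis=1).var(ddof=1)  # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: five respondents rating the importance of four tasks on the 1-4 scale.
example_ratings = np.array([
    [3, 4, 3, 4],
    [2, 3, 2, 3],
    [4, 4, 3, 4],
    [3, 3, 3, 3],
    [2, 2, 1, 2],
])
print(round(cronbach_alpha(example_ratings), 4))
```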

V. Delineation of Required Knowledge and Skills

Working under the direction of CASTLE, the panel of experts developed a comprehensive list of the knowledge and skills that the qualified automation professional must possess in order to provide competent service in each task area. Members of the expert panel drafted these lists at the time that the panel reached consensus on the tasks. CASTLE then circulated the list throughout the panel of experts and collected revisions and editorial suggestions for each list from the entire panel. Following the meeting, CASTLE and ISA arranged for a special committee to review the lists online, using software designed for that purpose in combination with a series of conference calls. CASTLE facilitated the review meetings, which led to the final listing presented in Phase III of this report.

It is useful when conducting a job analysis in connection with the content validation of a credentialing examination to understand that knowledge is normally considered a matter of the cognitive domain (Bloom, et al., 1956). Within the cognitive domain, predominating taxonomies use different levels to describe the learning outcomes desired. For a credentialing examination such as the CAP, the most common levels are knowledge (which includes recall and comprehension), application, and analysis. Knowledge refers to the remembering of previously learned subject matter and a grasp of its meaning. Application is the ability to use subject matter in job-related situations, and analysis refers to the ability to break subject matter into component parts in order to reveal its organization and structure. Skills may be psychomotor or they may involve cognitive skills, such as critical thinking. The CAP examination should include questions targeting each of these cognitive levels.


VI. Summary of Results

As shown in the preceding tables, the survey respondents indicated that all domains are very important. Each of the six domains has an average importance of at least 2.58 on the four-point rating scale, with 2 being “Moderately Important” and 3 being “Very Important.” Similarly, the respondents considered all domains to be critical. Each of the six domains has an average criticality rating of at least 2.43 on the four-point scale, which means that incompetent performance of tasks in each domain could result in “Moderate Harm” to “Substantial Harm” (of some type) to the public. It is of further value to note that the panel of experts and survey respondents agreed on the average ratings for importance and criticality of domains, with only one difference greater than one scale point. These data support the validity of the six domains as major categories of responsibility in the practice of automation.

Of interest in the analysis was the possibility that respondents’ status along biographical dimensions might influence their views about the practice of automation. All subgroups rated the domains within one scale point, or ten points on the frequency scale, with the exception of three cases. In these three instances, the highest subgroup mean exceeded the lowest by greater than ten points on the frequency scale. Two of these differences occurred when the area of primary responsibility was examined. Differences were found in the ratings of frequency for Domain IV (Development) and Domain VI (Operation and Maintenance). These differences were not unexpected, as those respondents who reported Operations and Maintenance as their primary area of responsibility reported spending 12.52 percent more time performing duties associated with Domain VI (Operation and Maintenance) than did those individuals who reported having another area of primary responsibility. Respondents reporting their primary area of responsibility as Project/Systems Engineering reported spending 10.71 percent more time in Domain IV (Development) than those individuals reporting their primary area of responsibility as Operations and Maintenance. However, no differences greater than one scale point were found on the importance and criticality ratings.

The final difference was found when examining subgroup differences based on current employer’s company or organization. The respondents reporting that their current employers were best described as Systems Integrators reported spending 14.09 percent more time in Domain IV (Development) than those reporting their current employers were best described as Control Systems Suppliers. However, no differences greater than one scale point were found on the importance and criticality ratings. Therefore, the differences observed were not considered meaningful in terms of influencing test specifications.

VII. Conclusion

The results of the job analysis survey validate the results of the panel of experts. This conclusion means that the domains and tasks developed by the job analysis panel constitute an accurate definition of the work of qualified automation professionals. Based on a psychometric analysis of the tasks, knowledge, and skills identified by the job analysis study, and given the depth of knowledge and skill implied for protection of the public, competence in the profession can best be assessed using a multiple-choice examination format.


PHASE III
TEST SPECIFICATIONS

The final phase of a job analysis study is the development of test specifications, which identify the proportion of questions from each domain and task that will appear on the CAP examination. Test specifications are developed by combining the overall evaluations of importance, frequency, and criticality and converting the results into percentages. Importance, frequency, and criticality ratings were weighted equally in this computation. These percentages are used to determine the number of questions related to each domain and task.

TEST BLUEPRINT
Domain                           % of Test   # of Items on Test
I. Feasibility Study                11.60%                   20
II. Definition                      15.23%                   26
III. System Design                  24.94%                   44
IV. Development                     22.04%                   39
V. Deployment                       15.24%                   27
VI. Operation and Maintenance       10.95%                   19
TOTAL                              100.00%                  175
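The report does not spell out the exact algebra behind these percentages, which CASTLE computed at the domain and task level. As a rough, non-authoritative sketch of the general idea, the Python snippet below normalizes each of the three equally weighted rating dimensions across domains and converts the averaged shares into item counts for a 175-item examination. The domain-level means are taken from the survey respondents' tables earlier in this report, while the normalization scheme itself is an assumption and does not reproduce the blueprint figures exactly.

```python
# Rough sketch (assumed scheme): equal weighting of importance, criticality, and
# frequency, each normalized to a share across the six domains, then scaled to the
# total number of items on the examination.

TOTAL_ITEMS = 175

# (importance, criticality, frequency) means from the survey respondents' evaluations.
domain_ratings = {
    "I. Feasibility Study":          (3.03, 2.43, 10.29),
    "II. Definition":                (3.35, 2.79, 14.55),
    "III. System Design":            (3.50, 3.32, 27.06),
    "IV. Development":               (2.99, 3.04, 23.92),
    "V. Deployment":                 (3.12, 3.21, 14.37),
    "VI. Operation and Maintenance": (2.58, 2.48, 9.82),
}

def blueprint(ratings: dict[str, tuple[float, float, float]], total_items: int):
    # Sum each rating dimension across domains so each can be turned into a share.
    dimension_sums = [sum(values[i] for values in ratings.values()) for i in range(3)]
    result = {}
    for name, values in ratings.items():
        share = sum(values[i] / dimension_sums[i] for i in range(3)) / 3  # equal weights
        result[name] = (round(100 * share, 2), round(share * total_items))
    return result

for name, (percent, items) in blueprint(domain_ratings, TOTAL_ITEMS).items():
    print(f"{name}: {percent:.2f}% -> {items} items")
```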


DOMAINS, TASKS, AND KNOWLEDGE AND SKILL STATEMENTS

This section of the report contains the domains, tasks, and knowledge and skill statements as delineated by the practice analysis panel of experts and validated with data from the practice analysis survey.

Domain I.   Feasibility Study
Domain II.  Definition
Domain III. System Design
Domain IV.  Development
Domain V.   Deployment
Domain VI.  Operation and Maintenance

Performance Domain I: Feasibility Study

Evaluation and Allocation of Questions for Domain I

RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1             2.68          2.18        1.80                1.96%                    4
2             2.63          2.13        1.84                1.95%                    3
3             2.84          2.37        1.88                2.09%                    4
4             2.55          2.24        1.84                1.95%                    3
5             2.39          2.00        1.61                1.77%                    3
6             2.57          2.11        1.70                1.88%                    3
TOTAL                                                      11.60%                   20

Domain I: Feasibility Study

Task 1: Define the preliminary scope through currently established work practices in order to meet the business need.

Knowledge of:
1. Established work practices
2. Basic process and/or equipment
3. Project management methodology
4. Automation opportunity identification techniques (e.g., dynamic performance measures)
5. Control and information technologies (MES) and equipment

Skill in:
1. Automating process and/or equipment
2. Developing value analyses

Task 2: Determine the degree of automation required through cost/benefit analysis in order to meet the business need.

Knowledge of:
1. Various degrees of automation
2. Various cost/benefit tools
3. Control and information technologies (MES) and equipment
4. Information technology and equipment

Skill in:
1. Analyzing cost versus benefit (e.g., life cycle analysis)
2. Choosing the degree of automation
3. Estimating the cost of control equipment and software

Task 3: Develop a preliminary automation strategy that matches the degree of automation required by considering an array of options and selecting the most reasonable option in order to prepare feasibility estimates.

Knowledge of:
1. Control strategies
2. Principles of measurement
3. Electrical components
4. Control components
5. Various degrees of automation

Skill in:
1. Evaluating different control strategies
2. Selecting appropriate measurements
3. Selecting appropriate components
4. Articulating concepts

Task 4: Conduct technical studies for the preliminary automation strategy by gathering data and conducting an appropriate analysis relative to requirements in order to define development needs and risks.

Knowledge of:
1. Process control theories
2. Machine control theories and mechatronics
3. Risk assessment techniques

Skill in:
1. Conducting technical studies
2. Conducting risk analyses
3. Defining primary control strategies

Task 5: Perform a justification analysis by generating a feasibility cost estimate and using an accepted financial model to determine project viability.

Knowledge of:
1. Financial models (e.g., ROI, NPV)
2. Business drivers
3. Costs of control equipment
4. Estimating techniques

Skill in:
1. Estimating the cost of the system
2. Running the financial model
3. Evaluating the results of the financial analysis for the automation portion of the project

Task 6: Create a conceptual summary document by reporting preliminary decisions and assumptions in order to facilitate "go"/"no go" decision making.

Knowledge of:
1. Conceptual summary outlines

Skill in:
1. Writing in a technical and effective manner
2. Compiling and summarizing information efficiently
3. Presenting information
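Task 5 above refers to accepted financial models such as ROI and NPV. The short Python sketch below illustrates an NPV-style justification check of the kind such a task might involve; the discount rate and cash-flow figures are invented for the example and do not come from the study.

```python
# Illustrative NPV calculation for an automation project justification (hypothetical figures).

def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value, where cash_flows[0] is the year-0 amount (typically negative)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Year 0: project cost; years 1-5: estimated annual savings from the automation project.
project_cash_flows = [-250_000, 70_000, 70_000, 70_000, 70_000, 70_000]
discount_rate = 0.10

value = npv(discount_rate, project_cash_flows)
print(f"NPV at {discount_rate:.0%}: ${value:,.0f}")  # a positive NPV supports a "go" decision
```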

Performance Domain II: Definition

Evaluation and Allocation of Questions for Domain II

RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1             3.11          2.55        2.05                3.23%                    5
2             2.60          2.18        1.89                2.79%                    5
3             3.23          2.87        2.10                3.43%                    6
4             2.69          2.23        1.89                2.85%                    5
5             2.83          2.35        1.84                2.94%                    5
TOTAL                                                      15.23%                   26

Domain II: Definition

Task 1: Determine operational strategies through discussion with key stakeholders and using appropriate documentation in order to create and communicate design requirements.

Knowledge of:
1. Interviewing techniques
2. Different operating strategies
3. Team leadership and alignment

Skill in:
1. Leading an individual or group discussion
2. Communicating effectively
3. Writing in a technical and effective manner
4. Building consensus
5. Interpreting the data from interviews

Task 2: Analyze alternative technical solutions by conducting detailed studies in order to define the final automation strategy.

Knowledge of:
1. Automation techniques
2. Control theories
3. Modeling and simulation techniques
4. Basic control elements (e.g., sensors, instruments, actuators, control systems, drive systems, HMI, batch control, machine control)
5. Marketplace products available
6. Process and/or equipment operations

Skill in:
1. Applying and evaluating automation solutions
2. Making intelligent decisions
3. Using the different modeling tools
4. Determining when modeling is needed

Task 3: Establish detailed requirements and data including network architecture, communication concepts, safety concepts, standards, vendor preferences, instrument and equipment data sheets, reporting and information needs, and security architecture through established practices in order to form the basis of the design.

Knowledge of:
1. Network architecture
2. Communication protocols, including field level
3. Safety concepts
4. Industry standards and codes
5. Security requirements
6. Safety standards (e.g., ISAM, ANSI, NFPA)
7. Control systems security practices

Skill in:
1. Conducting safety analyses
2. Determining which data is important to capture
3. Selecting applicable standards and codes
4. Identifying new guidelines that need to be developed
5. Defining information needed for reports
6. Completing instrument and equipment data sheets

Task 4: Generate a project cost estimate by gathering cost information in order to determine continued project viability.

Knowledge of:
1. Control system costs
2. Estimating techniques
3. Available templates and tools

Skill in:
1. Creating cost estimates
2. Evaluating project viability

Task 5: Summarize project requirements by creating a basis-of-design document and a user-requirements document in order to launch the design phase.

Knowledge of:
1. Basis-of-design outlines
2. User-requirements document outlines

Skill in:
1. Writing in a technical and effective manner
2. Compiling and summarizing information
3. Making effective presentations


Performance Domain III: System Design

Evaluation and Allocation of Questions for Domain III

RATINGS
Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1             3.31          3.26        2.16                3.15%                    5
2             2.83          2.42        1.98                2.61%                    5
3             3.04          2.69        2.22                2.87%                    5
4             2.80          2.41        2.02                2.61%                    5
5             2.79          2.45        1.97                2.60%                    4
6             3.13          2.76        2.21                2.92%                    5
7             2.86          2.59        1.97                2.68%                    5
8             2.97          2.66        2.26                2.85%                    5
9             2.80          2.59        2.01                2.67%                    5
TOTAL                                                      24.94%                   44

Domain III:

System Design

Task 1:

Perform safety and/or hazard analyses, security analyses, and regulatory compliance assessments by identifying key issues and risks in order to comply with applicable standards, policies, and regulations.

Knowledge of: 1. 2. 3.

Applicable standards (e.g., ISA S84, IEC 61508, 21 CFR Part 11, NFPA) Environmental standards (EPA) Electrical, electrical equipment, enclosure, and electrical classification standards (e.g., UL/FM, NEC, NEMA)

Skill in: 1. 2. 3. 4. 5.

Participating in a Hazard Operability Review Analyzing safety integrity levels Analyzing hazards Assessing security requirements or relevant security issues Applying regulations to design

ISA Certified Automation Professional Job Analysis Study

41

Task 2:

Establish standards, templates, and guidelines as applied to the automation system using the information gathered in the definition stage and considering human-factor effects in order to satisfy customer design criteria and preferences.

Knowledge of:
1. Process Industry Practices (PIP) (Construction Industry Institute)
2. IEC 61131 programming languages
3. Customer standards
4. Vendor standards
5. Template development methodology
6. Field devices
7. Control valves
8. Electrical standards (NEC)
9. Instrument selection and sizing tools
10. ISA standards (e.g., S88)

Skill in:
1. Developing programming standards
2. Selecting and sizing instrument equipment
3. Designing low-voltage electrical systems
4. Preparing drawings using AutoCAD software

Task 3: Create detailed equipment specifications and instrument data sheets based on vendor selection criteria, characteristics and conditions of the physical environment, regulations, and performance requirements in order to purchase equipment and support system design and development.

Knowledge of:
1. Field devices
2. Control valves
3. Electrical standards (NEC)
4. Instrument selection and sizing tools
5. Vendors' offerings
6. Motor and drive selection sizing tools

Skill in:
1. Selecting and sizing motors and drives
2. Selecting and sizing instrument equipment
3. Designing low-voltage electrical systems
4. Selecting and sizing computers
5. Selecting and sizing control equipment
6. Evaluating vendor alternatives
7. Selecting or sizing of input/output signal devices and/or conditioners

Task 4: Define the data structure layout and data flow model considering the volume and type of data involved in order to provide specifications for hardware selection and software development.

Knowledge of:
1. Data requirements of system to be automated
2. Data structures of control systems
3. Data flow of control systems
4. Productivity tools and software (e.g., InTools, AutoCAD)
5. Entity relationship diagrams


Skill in:
1. Modeling data
2. Tuning and normalizing databases

Task 5: Select the physical communication media, network architecture, and protocols based on data requirements in order to complete system design and support system development.

Knowledge of:
1. Vendor protocols
2. Ethernet and other open networks (e.g., Devicenet)
3. Physical requirements for networks/media
4. Physical topology rules/limitations
5. Network design
6. Security requirements
7. Backup practices
8. Grounding and bonding practices

Skill in:
1. Designing networks based on chosen protocols

Task 6: Develop a functional description of the automation solution (e.g., control scheme, alarms, HMI, reports) using rules established in the definition stage in order to guide development and programming.

Knowledge of:
1. Control theory
2. Visualization, alarming, database/reporting techniques
3. Documentation standards
4. Vendors' capabilities for their hardware and software products
5. General control strategies used within the industry
6. Process/equipment to be automated
7. Operating philosophy

Skill in:
1. Writing functional descriptions
2. Interpreting design specifications and user requirements
3. Communicating the functional description to stakeholders

Task 7: Design the test plan using chosen methodologies in order to execute appropriate testing relative to functional requirements.

Knowledge of:
1. Relevant test standards
2. Simulation tools
3. Process Industry Practices (PIP) (Construction Industry Institute)
4. General software testing procedures
5. Functional description of the system/equipment to be automated

Skill in:
1. Writing test plans
2. Developing tests that validate that the system works as specified


Task 8:

Perform the detailed design for the project by converting the engineering and system design into purchase requisitions, drawings, panel designs, and installation details consistent with the specification and functional descriptions in order to provide detailed information for development and deployment.

Knowledge of:
1. Field devices, control devices, visualization devices, computers, and networks
2. Installation standards and recommended practices
3. Electrical and wiring practices
4. Specific customer preferences
5. Functional requirements of the system/equipment to be automated
6. Applicable construction codes
7. Documentation standards

Skill in:
1. Performing detailed design work
2. Documenting the design

Task 9:

Prepare comprehensive construction work packages by organizing the detailed design information and documents in order to release project for construction.

Knowledge of:
1. Applicable construction practices
2. Documentation standards

Skill in:
1. Assembling construction work packages


Performance Domain IV: Development
Evaluation and Allocation of Questions for Domain IV

Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1          2.99          2.61         2.33            2.82%                  5
2          2.75          2.35         2.18            2.59%                  4
3          3.23          3.08         2.56            3.15%                  6
4          2.78          2.53         2.09            2.63%                  5
5          3.00          2.87         1.95            2.78%                  5
6          2.95          2.65         2.13            2.75%                  5
7          3.17          2.91         2.11            2.91%                  5
8          2.73          2.22         1.85            2.42%                  4
TOTAL                                                22.04%                 39

Domain IV: Development

Task 1:

Develop Human Machine Interface (HMI) in accordance with the design documents in order to meet the functional requirements.

Knowledge of:
1. Specific HMI software products
2. Tag definition schemes
3. Programming structure techniques
4. Network communications
5. Alarming schemes
6. Report configurations
7. Presentation techniques
8. Database fundamentals
9. Computer operating systems
10. Human factors
11. HMI supplier options

Skill in:
1. Presenting data in a logical and aesthetic fashion
2. Creating intuitive navigation menus
3. Implementing connections to remote devices
4. Documenting configuration and programming
5. Programming configurations
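As a purely illustrative aside, the "tag definition scheme" and "alarming scheme" knowledge items above are the kind of structure sketched below. The class, field names, tag name, and limits are hypothetical examples, not taken from the study or from any particular HMI product.

    # Minimal sketch of a tag definition with alarm limits; names and values are
    # hypothetical and for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Tag:
        name: str          # e.g., an ISA-style tag name such as "TIC-101.PV"
        description: str
        units: str
        lo_alarm: float    # low alarm limit
        hi_alarm: float    # high alarm limit

        def alarm_state(self, value: float) -> str:
            """Return a simple alarm state for a sampled value."""
            if value < self.lo_alarm:
                return "LOW"
            if value > self.hi_alarm:
                return "HIGH"
            return "NORMAL"

    reactor_temp = Tag("TIC-101.PV", "Reactor outlet temperature", "degC", 80.0, 120.0)
    print(reactor_temp.alarm_state(131.5))  # -> "HIGH"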


Task 2:

Develop database and reporting functions in accordance with the design documents in order to meet the functional requirements.

Knowledge of:
1. Relational database theory
2. Specific database software products
3. Specific reporting products
4. Programming/scripting structure techniques
5. Network communications
6. Structured Query Language (SQL)
7. Report configurations
8. Entity diagram techniques
9. Computer operating systems
10. Data mapping

Skill in:
1. Presenting data in a logical and aesthetic fashion
2. Administrating databases
3. Implementing connections to remote applications
4. Writing queries
5. Creating reports and formatting/printing specifications for report output
6. Documenting database configuration
7. Designing databases
8. Interpreting functional description
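To make the "writing queries" and "creating reports" skills above concrete, here is a minimal, self-contained sketch using Python's standard sqlite3 module. The table, tag names, and values are hypothetical examples and are not drawn from the study; a real project would use whatever historian or relational product the design documents specify.

    # Illustrative only: a tiny relational table and an end-of-batch report query.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE batch_events (batch_id TEXT, tag TEXT, value REAL, ts TEXT)"
    )
    conn.executemany(
        "INSERT INTO batch_events VALUES (?, ?, ?, ?)",
        [
            ("B-1001", "TIC-101.PV", 118.2, "2004-03-01 08:00"),
            ("B-1001", "TIC-101.PV", 121.7, "2004-03-01 08:05"),
            ("B-1002", "TIC-101.PV", 119.4, "2004-03-02 09:00"),
        ],
    )

    # A simple report: maximum recorded temperature per batch.
    for batch_id, max_temp in conn.execute(
        "SELECT batch_id, MAX(value) FROM batch_events GROUP BY batch_id ORDER BY batch_id"
    ):
        print(f"{batch_id}: max temperature {max_temp:.1f} degC")
    conn.close()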

Task 3: Develop control configuration or programming in accordance with the design documents in order to meet the functional requirements.

Knowledge of:
1. Specific control software products
2. Tag definition schemes
3. Programming structure techniques
4. Network communications
5. Alarming schemes
6. I/O structure
7. Memory addressing schemes
8. Hardware configuration
9. Computer operating systems
10. Processor capabilities
11. Standard nomenclature (e.g., ISA)
12. Process/equipment to be automated

Skill in:
1. Interpreting functional description
2. Interpreting control strategies and logic drawings
3. Programming and/or configuration capabilities
4. Implementing connections to remote devices
5. Documenting configuration and programs
6. Interpreting P&IDs
7. Interfacing systems


Task 4:

Implement data transfer methodology that maximizes throughput and ensures data integrity using communication protocols and specifications in order to assure efficiency and reliability.

Knowledge of:
1. Specific networking software products (e.g., I/O servers)
2. Network topology
3. Network protocols
4. Physical media specifications (e.g., copper, fiber, RF, IR)
5. Computer operating systems
6. Interfacing and gateways
7. Data mapping

Skill in:
1. Analyzing throughput
2. Ensuring data integrity
3. Troubleshooting
4. Documenting configuration
5. Configuring network products
6. Interfacing systems
7. Manipulating data

Task 5: Implement security methodology in accordance with stakeholder requirements in order to mitigate loss and risk.

Knowledge of:
1. Basic system/network security techniques
2. Customer security procedures
3. Control user-level access privileges
4. Regulatory expectations (e.g., 21 CFR Part 11)
5. Industry standards (e.g., ISA)

Skill in:
1. Documenting security configuration
2. Configuring/programming of security system
3. Implementing security features

Task 6:

Review configuration and programming using defined practices in order to establish compliance with functional requirements.

Knowledge of:
1. Specific control software products
2. Specific HMI software products
3. Specific database software products
4. Specific reporting products
5. Programming structure techniques
6. Network communication
7. Alarming schemes
8. I/O structure
9. Memory addressing schemes
10. Hardware configurations
11. Computer operating systems
12. Defined practices
13. Functional requirements of system/equipment to be automated


Skill in:
1. Programming and/or configuration capabilities
2. Documenting configuration and programs
3. Reviewing programming/configuration for compliance with design requirements

Task 7: Test the automation system using the test plan in order to determine compliance with functional requirements.

Knowledge of:
1. Testing techniques
2. Specific control software products
3. Specific HMI software products
4. Specific database software products
5. Specific reporting products
6. Network communications
7. Alarming schemes
8. I/O structure
9. Memory addressing schemes
10. Hardware configurations
11. Computer operating systems
12. Functional requirements of system/equipment to be automated

Skill in:
1. Writing test plans
2. Executing test plans
3. Documenting test results
4. Programming and/or configuration capabilities
5. Implementing connections to remote devices
6. Interpreting functional requirements of system/equipment to be automated
7. Interpreting P&IDs

Task 8:

Assemble all required documentation and user manuals created during the development process in order to transfer essential knowledge to customers and end users.

Knowledge of:
1. General understanding of automation systems
2. Computer operating systems
3. Documentation practices
4. Operations procedures
5. Functional requirements of system/equipment to be automated

Skill in:
1. Documenting technical information for non-technical audience
2. Using documentation tools
3. Organizing material for readability


Performance Domain V: Deployment
Evaluation and Allocation of Questions for Domain V

Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1          2.75          2.49         2.05            1.16%                  2
2          2.86          2.74         2.14            1.24%                  2
3          2.82          2.46         2.11            1.18%                  2
4          3.34          3.01         2.51            1.41%                  3
5          3.18          2.96         2.29            1.34%                  3
6          2.99          2.74         2.16            1.26%                  2
7          3.52          3.51         2.23            1.48%                  3
8          3.02          2.83         1.99            1.25%                  2
9          2.87          2.52         1.81            1.15%                  2
10         3.11          2.89         2.01            1.28%                  2
11         3.07          2.74         2.21            1.28%                  2
12         2.97          2.58         2.06            1.21%                  2
TOTAL                                                15.24%                 27

Domain V: Deployment

Task 1:

Perform receipt verification of all field devices by comparing vendor records against design specifications in order to ensure that devices are as specified.

Knowledge of:
1. Field devices (e.g., transmitters, final control valves, controllers, variable speed drives, servo motors)
2. Design specifications

Skill in:
1. Interpreting specifications and vendor documents
2. Resolving differences


Task 2:

Perform physical inspection of installed equipment against construction drawings in order to ensure installation in accordance with design drawings and specifications.

Knowledge of:
1. Construction documentation
2. Installation practices (e.g., field devices, computer hardware, cabling)
3. Applicable codes and regulations

Skill in:
1. Interpreting construction drawings
2. Comparing physical implementation to drawings
3. Interpreting codes and regulations (e.g., NEC, building codes, OSHA)
4. Interpreting installation guidelines

Task 3: Install configuration and programs by loading them into the target devices in order to prepare for testing.

Knowledge of:
1. Control system (e.g., PLC, DCS, PC)
2. System administration

Skill in:
1. Installing software
2. Verifying software installation
3. Versioning techniques and revision control
4. Troubleshooting (i.e., resolving issues and retesting)

Task 4: Solve unforeseen problems identified during installation using troubleshooting skills in order to correct deficiencies.

Knowledge of:
1. Troubleshooting techniques
2. Problem-solving strategies
3. Critical thinking
4. Processes, equipment, configurations, and programming
5. Debugging techniques

Skill in:
1. Solving problems
2. Determining root causes
3. Ferreting out information
4. Communicating with facility personnel
5. Implementing problem solutions
6. Documenting problems and solutions

Task 5: Test configuration and programming in accordance with the design documents by executing the test plan in order to verify that the system operates as specified.

Knowledge of:
1. Programming and configuration
2. Test methodology (e.g., factory acceptance test, site acceptance test, unit-level testing, system-level testing)
3. Test plan for the system/equipment to be automated
4. System to be tested
5. Applicable regulatory requirements relative to testing


Skill in:
1. Executing test plans
2. Documenting test results
3. Troubleshooting (i.e., resolving issues and retesting)
4. Writing test plans

Task 6: Test communication systems and field devices in accordance with design specifications in order to ensure proper operation.

Knowledge of:
1. Test methodology
2. Communication networks and protocols
3. Field devices and their performance requirements
4. Regulatory requirements relative to testing

Skill in:
1. Verifying network integrity and data flow integrity
2. Conducting field device tests
3. Comparing test results to design specifications
4. Documenting test results
5. Troubleshooting (i.e., resolving issues and retesting)
6. Writing test plans

Task 7: Test all safety elements and systems by executing test plans in order to ensure that safety functions operate as designed.

Knowledge of:
1. Applicable safety
2. Safety system design
3. Safety elements
4. Test methodology
5. Facility safety procedures
6. Regulatory requirements relative to testing

Skill in:
1. Executing test plans
2. Documenting test results
3. Testing safety systems
4. Troubleshooting (i.e., resolving issues and retesting)
5. Writing test plans

Task 8: Test all security features by executing test plans in order to ensure that security functions operate as designed.

Knowledge of:
1. Applicable security standards
2. Security system design
3. Test methodology
4. Vulnerability assessments
5. Regulatory requirements relative to testing

Skill in:
1. Executing test plans
2. Documenting test results
3. Testing security features
4. Troubleshooting (i.e., resolving issues and retesting)
5. Writing test plans


Task 9:

Provide initial training for facility personnel in system operation and maintenance through classroom and hands-on training in order to ensure proper use of the system.

Knowledge of:
1. Instructional techniques
2. Automation systems
3. Networking and data communications
4. Automation maintenance techniques
5. System/equipment to be automated
6. Operating and maintenance procedures

Skill in:
1. Communicating with trainees
2. Organizing instructional materials
3. Instructing

Task 10: Execute system-level tests in accordance with the test plan in order to ensure the entire system functions as designed.

Knowledge of:
1. Test methodology
2. Field devices
3. System/equipment to be automated
4. Networking and data communications
5. Safety systems
6. Security systems
7. Regulatory requirements relative to testing

Skill in:
1. Executing test plans
2. Documenting test results
3. Testing of entire systems
4. Communicating final results to facility personnel
5. Troubleshooting (i.e., resolving issues and retesting)
6. Writing test plans

Task 11: Troubleshoot problems identified during testing using a structured methodology in order to correct system deficiencies.

Knowledge of:
1. Troubleshooting techniques
2. Processes, equipment, configurations, and programming

Skill in:
1. Solving problems
2. Determining root causes
3. Communicating with facility personnel
4. Implementing problem solutions
5. Documenting test results


Task 12:

Make necessary adjustments using applicable tools and techniques in order to demonstrate system performance and turn the automated system over to operations.

Knowledge of:
1. Loop tuning methods/control theory
2. Control system hardware
3. Computer system performance tuning
4. User requirements
5. System/equipment to be automated

Skill in:
1. Tuning control loops
2. Adjusting final control elements
3. Optimizing software performance
4. Communicating final system performance results
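As an illustration of the "loop tuning methods/control theory" knowledge listed above, the sketch below applies the classical Ziegler-Nichols ultimate-gain rules. The ultimate gain and period are made-up example numbers; real tuning would use values measured on the actual loop, and many other methods (lambda tuning, IMC, vendor autotuners) are equally valid in practice.

    # Minimal sketch of Ziegler-Nichols closed-loop PID tuning; inputs are hypothetical.
    def zn_pid(ku: float, pu: float) -> dict:
        """Classic Ziegler-Nichols PID settings from ultimate gain Ku and period Pu (s)."""
        return {
            "Kp": 0.6 * ku,   # proportional gain
            "Ti": pu / 2.0,   # integral (reset) time, seconds
            "Td": pu / 8.0,   # derivative (rate) time, seconds
        }

    print(zn_pid(ku=4.0, pu=30.0))  # e.g., {'Kp': 2.4, 'Ti': 15.0, 'Td': 3.75}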


Performance Domain VI: Operation and Maintenance
Evaluation and Allocation of Questions for Domain VI

Task    Importance   Criticality   Frequency   % of Items on Test   # of Items on Test
1          2.39          2.10         1.65            0.91%                  2
2          2.76          2.26         2.15            1.06%                  2
3          2.17          1.91         1.39            0.81%                  1
4          2.39          2.05         1.42            0.87%                  1
5          2.41          2.13         1.89            0.95%                  2
6          2.20          1.94         1.56            0.84%                  1
7          2.39          1.88         1.81            0.90%                  2
8          2.47          1.78         1.44            0.84%                  1
9          2.33          1.87         1.34            0.82%                  1
10         2.34          2.16         1.45            0.88%                  2
11         2.66          2.49         1.62            1.00%                  2
12         2.77          2.51         1.95            1.07%                  2
TOTAL                                                10.95%                 19

Domain VI: Operation and Maintenance

Task 1:

Verify system performance and records periodically using established procedures in order to ensure compliance with standards, regulations, and best practices.

Knowledge of:
1. Applicable standards
2. Performance metrics and acceptable limits
3. Records and record locations
4. Established procedures and purposes of procedures

Skill in:
1. Communicating orally and in writing
2. Auditing the system/equipment
3. Analyzing data and drawing conclusions

Task 2:

Provide technical support for facility personnel by applying system expertise in order to maximize system availability.


Knowledge of:
1. All system components
2. Processes and equipment
3. Automation system functionality
4. Other support resources
5. Control systems theories and applications
6. Analytical troubleshooting and root-cause analyses

Skill in:
1. Troubleshooting (i.e., resolving issues and retesting)
2. Investigating and listening
3. Programming and configuring automation system components

Task 3: Perform training needs analysis periodically for facility personnel using skill assessments in order to establish objectives for the training program.

Knowledge of:
1. Personnel training requirements
2. Automation system technology
3. Assessment frequency
4. Assessment methodologies

Skill in:
1. Interviewing
2. Assessing level of skills

Task 4: Provide training for facility personnel by addressing identified objectives in order to ensure the skill level of personnel is adequate for the technology and products used in the system.

Knowledge of:
1. Training resources
2. Subject matter and training objectives
3. Teaching methodology

Skill in:
1. Writing training objectives
2. Creating the training
3. Organizing training classes (e.g., securing demos, preparing materials, securing space)
4. Delivering training effectively
5. Answering questions effectively

Task 5:

Monitor performance using software and hardware diagnostic tools in order to support early detection of potential problems.

Knowledge of:
1. Automation systems
2. Performance metrics
3. Software and hardware diagnostic tools
4. Potential problem indicators
5. Baseline/normal system performance
6. Acceptable performance limits

Skill in:
1. Using the software and hardware diagnostic tools
2. Analyzing data
3. Troubleshooting (i.e., resolving issues and retesting)
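To illustrate the "baseline/normal system performance" and "acceptable performance limits" items above, here is a minimal threshold-check sketch. The metric names, baseline values, and limits are hypothetical examples, not taken from the study or from any diagnostic product.

    # Illustrative sketch of threshold-based monitoring against a baseline.
    baseline = {"cpu_load_pct": 35.0, "scan_time_ms": 80.0, "comm_errors_per_hr": 0.0}
    limits   = {"cpu_load_pct": 70.0, "scan_time_ms": 150.0, "comm_errors_per_hr": 5.0}

    def check(sample: dict) -> list:
        """Return warnings for metrics that exceed their acceptable limits."""
        warnings = []
        for metric, value in sample.items():
            if value > limits[metric]:
                warnings.append(
                    f"{metric}={value} exceeds limit {limits[metric]} "
                    f"(baseline {baseline[metric]})"
                )
        return warnings

    print(check({"cpu_load_pct": 82.0, "scan_time_ms": 90.0, "comm_errors_per_hr": 1.0}))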


Task 6: Perform periodic inspections and tests in accordance with written standards and procedures in order to verify system or component performance against requirements.

Knowledge of:
1. Performance requirements
2. Inspection and test methodologies
3. Acceptable standards

Skill in:
1. Testing and inspecting
2. Analyzing test results
3. Communicating effectively with others in written or oral form

Task 7: Perform continuous improvement by working with facility personnel in order to increase capacity, reliability, and/or efficiency.

Knowledge of:
1. Performance metrics
2. Control theories
3. System/equipment operations
4. Business needs
5. Optimization tools and methods

Skill in:
1. Analyzing data
2. Programming and configuring
3. Communicating effectively with others
4. Implementing continuous improvement procedures

Task 8:

Document lessons learned by reviewing the project with all stakeholders in order to improve future projects.

Knowledge of:
1. Project review methodology
2. Project history
3. Project methodology and work processes
4. Project metrics

Skill in:
1. Communicating effectively with others
2. Configuring and programming
3. Documenting lessons learned
4. Writing and summarizing


Task 9:

Maintain licenses, updates, and service contracts for software and equipment by reviewing both internal and external options in order to meet expectations for capability and availability.

Knowledge of:
1. Installed base of system equipment and software
2. Support agreements
3. Internal and external support resources
4. Lifecycle state and support level (including vendor product plans and future changes)

Skill in:
1. Organizing and scheduling
2. Programming and configuring
3. Applying software updates (i.e., keys, patches)

Task 10: Determine the need for spare parts based on an assessment of installed base and probability of failure in order to maximize system availability and minimize cost.

Knowledge of:
1. Critical system components
2. Installed base of system equipment and software
3. Component availability
4. Reliability analysis
5. Sourcing of spare parts

Skill in:
1. Acquiring and organizing information
2. Analyzing data
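One common way to connect "installed base and probability of failure" to a stocking decision, as this task describes, is a Poisson spares calculation; the sketch below is only an illustration. The failure rate, lead time, and service level are hypothetical inputs, and real spares programs also weigh cost, criticality, and vendor lead times.

    # Sketch: smallest spare-parts stock covering lead-time demand at a service level,
    # assuming Poisson failure arrivals. All numeric inputs are hypothetical.
    import math

    def spares_needed(installed: int, failures_per_unit_year: float,
                      lead_time_days: float, service_level: float = 0.95) -> int:
        """Smallest s with P(demand during lead time <= s) >= service_level."""
        lam = installed * failures_per_unit_year * lead_time_days / 365.0
        s, cumulative = 0, math.exp(-lam)
        while cumulative < service_level:
            s += 1
            cumulative += math.exp(-lam) * lam ** s / math.factorial(s)
        return s

    # Example: 40 identical transmitters, 0.1 failures/unit/year, 60-day lead time.
    print(spares_needed(installed=40, failures_per_unit_year=0.1, lead_time_days=60))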

Task 11: Provide a system management plan by performing preventive maintenance, implementing backups, and designing recovery plans in order to avoid and recover from system failures.

Knowledge of:
1. Automation systems
2. Acceptable system downtime
3. Preventative and maintenance procedures
4. Backup practices (e.g., frequency, storage media, storage location)

Skill in:
1. Acquiring and organizing
2. Leading
3. Managing crises
4. Performing backups and restores
5. Using system tools


Task 12:

Follow a process for authorization and implementation of changes in accordance with established standards or practices in order to safeguard system and documentation integrity.

Knowledge of:
1. Management of change procedures
2. Automation systems and documentation
3. Configuration management practices

Skill in:
1. Programming and configuring
2. Updating documentation


Appendix A: Contributors

ISA would like to thank these individuals and their employers for their contribution of time, expertise, and enthusiasm for the Certified Automation Professional (CAP) program.

CAP Steering Team: Vernon Trevathan, Chair, Principal Consultant, Control & Integration Management, LLC, MO

Gerald Wilbanks, Principal Documentation & Engineering Services, LLC, AL

Ken Baker (retired) Eli Lilly, IN

Paul Galeski, President Maverick Technologies, IL

Additional Contributors: Dave Adler, Senior Engineering Consultant Eli Lilly, IN

Greg McMillan (retired) Austin, TX

Dan Bielski, Vice President Benham Systems, MI

Jeff Miller, Director of Project Management Interstates Control Systems, Inc., IA

Joe Bingham, Environmental Specialist Sempra Energy Solutions, CA

Dave Panish, President Enterprise Automation, Inc., CA

Brent Carlson, Systems Engineer 3M, MN

Art Pietrzyk, Automation Engineer Rockwell Automation, OH

Alan Carty, President Automationtechies.com, MN

Jonathan Pollet, President PlantData Technologies, TX

Dr. Gerald Cockrell, Professor Indiana State University, IN

Doug Ratzlaff, VP Americas Project Excellence Emerson Process Management, AB

Skip Holmes, Associate Director – Control & Information Systems, Corporate Engineering Technologies Proctor & Gamble, OH

Joe Ruder, Principal Controls Engineer Nestle' Purina Petcare, MO

Gavin Jacobs, Principal Engineer, Emerson Process Management, AB

Lee Lane, Manager of Applications Engineering, Rockwell Automation, OH

Bob Lindeman, Senior Project Manager, Aerospace Testing Alliance, TN

Ron Lutes, Vice President Performance Solutions, COMPEX, MO

Paul Maurath, Technical Section Head, P&G, OH

Nicholas Sands, Control Engineer, E I du Pont, NJ

George Skene, Senior Controls Engineer, The Benham Companies, Inc., MI

Chris Stephens, Design Engineer III, Fluor Corporation, TX

Ken Valentine, Director Design Engineering – Control Systems, Fluor Corporation, TX

Jeff White, Control Engineer, Interstates Control Systems, Inc., IA

Appendix B: Other Responses

Table VII. Other Primary Responsibility in Current Position
automation sales/support; Control System Engineering Corp; Controls Manager & Tech Direction Setting; Corporate management; Corporate Management; CSE; design consultant; Engineering Consultant; Engineering Management; Engineering Research; Environmental Engineering; General Management; Instrumentation Sales; Management (2); Management of automation teams across the US; Manufacturing systems and Computer systems validat; Project Manufacturing Eng.; research and development; Sales and Marketing; Sales Management; Senior management; Staff Engineer; System construction. Project management commish.; System Design; Systems Design; Technical support

Table VIII. Other Industry Worked In
all; Amunition/Exsplosives; Bulding Materials; education; Education; Food/PHarm/SPecialty Chemical; Industrial gas; Management; Manufacturer - Vendor; Manufacturing - general; mining; Mining; Mining and Metals; mining and refining; Non-Metals Mining; oil and gas pipelines and facilities; oil well- field control; Process Automation Supplier; Semiconductor (2); Semiconductor manuf./inhouse const.; Software consultancy; Systems Integration (all industries)

Table IX. Other Employer's Company or Organization
Automation distributor; Cable TV; Central Engineering; Combonation of End User and OEM; Consultant; Control Systems Manufacturer; Education; Instrument systems calibrations to NIST; Manufacturing; Pharmaceutical Manufacturer; Pharmaceutical Mfg; Semiconductor Process Manufacturing; University

Table X. Other Certifications and Licenses
Chartered Eng; Chartered Engineer (UK); CPNE; CSAT (Previously); CSVA; EIT (4); Elec. contractor; journeyman electrician; electrical contractor; NARTE; Electro-mechanical tech; Process Control Technologi; FCC; MCP MCSA MCDBA CCNA; MIChemE CEng; N3 & T1 + 28years Exp.; not certified in canada; P.E. from Ontario Canada; PhD Chem Eng; RPT (Eng); WTP & WWTP Operator Licenses

Table XI. Other Professional Societies and/or Organizations
ACM; APEGGA; APEO; ashrae; ASHRAE; ASME; AWWA; British IChemE; control engineering; CPNE; IChemE (UK); IChemE WBF; ispe; ISPE (4); ISPE ACM; MBAA; NFPA; NSPE; NSPE ISPE ASQ; PDA; PMI (2); SEMI Sematech; SME (2); SME NSPE; WBF (2); WEF

Table XIII. Other Level of Education
3 year college diploma; A.A.S.E. with approx. 3 years toward a B.S.I.T.; MSEE and MBA; Technical College/Technicon

Appendix C: Major/Focus of Highest Degree

Accounting (2); Aircraft Maintenance; Architectural; Automation (2); automation engineering; biological sciences; Bioscience; Business (2); Business Administration (9); Business Management; ChE; Chemical Engineering (28); Chemistry; civil engineer; Computer engineering; Computer Engineering; computer information systems; computer integrated manufacturing; Computer Integrated Manufacturing Technology; Computer Science (2); Computers Science; control; Control Engineering; Control System Engineering; control systems; Control Systems; control systems engineering; Controls Engineering; Drafting&Design; education; EE (2); EE/control systems; EET; electrical & computer Engineering; ELECTRICAL & COMPUTER ENGINEERING; Electrical Engineering (68); electronic engineering technology; Electronics (2); Electronics Engineering; Electronics Engineering Technology; Electronics Technology; Electroninc Technology; Engineering Mgmt; Engineering Science-Control Systems option; Environmental; environmental science; Environmental Sciences; Forestry; Industrial Maintenace; Industrial Technology w/ Electronics minor; Information Systems; Information Technology; Information Technology & Management; instrumentation; Instrumentation; Instrumentation & Control (from DeVry Technical); Instrumentation and control engineering; management information systems; Manufacturing Systems Engineering; Manufacturing Systems; MBA; MBA (3); Measurement; Mech.eng HVAC sub-specialty; Mechanical Engineering (11); mechanical engineering (Automatic Controls); metallurgical engineering; Nuclear Engineering; Operations; Physics; Power Electronics; process control; Process Control; Science & Mathematics; Thermal/Fluid Sciences; Welding

Appendix D: Practice Analysis Survey

ISA – THE INSTRUMENTATION, SYSTEMS, AND AUTOMATION SOCIETY
CERTIFIED AUTOMATION PROFESSIONAL
ROLE DELINEATION SURVEY
MARCH 2004

Instructions for Completing the Instrumentation, Systems, and Automation Society Role Delineation Survey for the Certified Automation Professional

This booklet contains the ISA – The Instrumentation, Systems, and Automation Society role delineation survey for the Certified Automation Professional along with instructional materials to aid you in completing it. Directions are provided at the beginning of each section of the survey.

ISA – The Instrumentation, Systems, and Automation Society is developing a new certification for automation professionals to cover the entire field of automation application. We appreciate your time in completing this survey and we value your important input.

In Section A, you are asked to complete a Confidential Survey, which provides us with the demographic information necessary to ensure that automation professionals working in various settings with differing backgrounds are represented in the data collection.

In Section B, we have provided you with a list of definitions and terms that are used throughout the survey. We suggest that you review the Definition of Terms before responding to any survey questions.


In Section C, you are asked to review the Performance Domains that define the role of the Certified Automation Professional. We ask that you rate the importance, criticality, and frequency of these domains as they pertain to the role of the Certified Automation Professional.

In Section D, you are asked to review the Task Statements required for competent performance in each performance domain by the Certified Automation Professional, and rate each for importance, criticality, and frequency as they pertain to the role of the Certified Automation Professional.

Please review the entire booklet before responding to any of the questions. Your review will help you to understand our terminology and the structure of the role delineation survey. Please mark your responses directly in this booklet.

Please return your completed survey by 2 April 2004 in the enclosed, self-addressed, stamped envelope to:

CASTLE Worldwide, Inc.
Post Office Box 570
Morrisville, North Carolina 27560-0570

Thank you in advance for your help with this very important project. ISA will use your responses to help determine the blueprint for the ISA Certified Automation Professional Examination.


Section A
Confidential Survey

Please fill in the following demographic information, which will be used to ensure that automation professionals working in various settings with differing backgrounds are represented in the data collection. All responses are kept strictly confidential by CASTLE Worldwide, Inc. Computer programs are used to sort the data. Neither individual persons or companies nor their particular data will be identifiable in any report generated using information obtained through this survey. Please check the appropriate boxes, or print your responses.

1. Gender (Please select one.)
   ☐ Male   ☐ Female

2. Age (Please select one.)
   ☐ Under 30 years   ☐ 31-40 years   ☐ 41-50 years   ☐ 51-60 years   ☐ 61 years and above

3. In which state/province do you work? (Please list one.)
____________________________________________________________________________________

4. How much experience do you have as an automation professional? (Please select one.)
   ☐ I am not an automation professional.   ☐ Less than 1 year   ☐ 1-5 years
   ☐ 6-10 years   ☐ 11-15 years   ☐ More than 15 years

5. What percentage of your time do you spend working as an automation professional in your current position? (Please select one.)
   ☐ I am not an automation professional.   ☐ Less than 25 percent   ☐ 25-50 percent
   ☐ 51-75 percent   ☐ 76-100 percent

6. Which control area(s) do you work in on a daily basis? (Please select one.)
   ☐ Discrete/Machine Control   ☐ Process/Liquid/Dry
   ☐ Both Discrete/Machine Control and Process/Liquid/Dry

7. What is your primary responsibility in your current position? (Please select one.)
   ☐ Field Engineering   ☐ Information Systems   ☐ Operations and Maintenance
   ☐ Project/System Engineering   ☐ Other (Please specify.) ___________________________________

8. Which of the following best describes the industry in which you work? (Please select one.)
   ☐ Aerospace   ☐ Automotive Manufacturing   ☐ Building Automation   ☐ Chemical Manufacturing
   ☐ Consumer Goods   ☐ Electrical/Electronic Manufacturing   ☐ Engineering and Construction
   ☐ Environmental/Waste   ☐ Food and Beverage Manufacturing   ☐ Machinery Manufacturing
   ☐ Metals Manufacturing   ☐ Petroleum Manufacturing   ☐ Pharmaceutical Manufacturing
   ☐ Plastics Manufacturing   ☐ Pulp and Paper Manufacturing   ☐ Textiles/Fabrics Manufacturing
   ☐ Transportation   ☐ Utilities   ☐ Water/Waste
   ☐ Other (Please specify.) ___________________________________

9. Which of the following best describes your current employer's company or organization? (Please select one.)
   ☐ Control Systems Suppliers   ☐ End-Users (petro-chem, food and beverage, pulp and paper)
   ☐ Engineering and Design Firm   ☐ Original Equipment Manufacturer (OEM)   ☐ Systems Integrators
   ☐ Other (Please specify.) ___________________________________

10. What certifications/licenses do you currently hold? (Please select all that apply.)
   ☐ CEM   ☐ CQE   ☐ CCST   ☐ CSE   ☐ MSCE   ☐ PE   ☐ PMP
   ☐ Other (Please specify.) ___________________

11. In which professional societies and/or organizations do you currently hold membership? (Please select all that apply.)
   ☐ AIChE   ☐ ASME   ☐ CSIA   ☐ IBEW   ☐ IEEE   ☐ ISA   ☐ UA
   ☐ Other (Please specify.) ___________________________________

12. What is your highest level of education? (Please select one.)
   ☐ High School/Secondary School   ☐ Associate Degree   ☐ Bachelor's Degree
   ☐ Master's Degree   ☐ Doctoral Degree   ☐ Other (Please specify.) ___________________________________

13. What is the major/focus of study of your highest degree? (e.g., measurement, business administration, mechanical engineering, electrical engineering)
__________________________________________________________________________________

14. What is your annual income? (Please select one.)
   ☐ Less than $20,000   ☐ $20,000 to $49,999   ☐ $50,000 to $79,999
   ☐ $80,000 to $110,000   ☐ More than $110,000

Section B
Definition of Terms

Below are definitions of the terms found in this role delineation survey.

Certified Automation Professional (CAP): The ISA Certified Automation Professional (CAP) has completed a four-year technical degree* and five years of experience working in automation. CAPs are responsible for the direction, definition, design, development/application, deployment, documentation, and support of systems, software, and equipment used in control systems, manufacturing information systems, systems integration, and operational consulting.

* There may be a variety of ways to combine education and experience to satisfy eligibility requirements for an introductory two-year period.

Performance Domain: The performance domains are the major responsibilities or duties that define the role of the Certified Automation Professional. Each performance domain may be considered a major heading in an outline and may include a brief behavioral description. There are six performance domains included in this survey, as identified by an expert panel:

• Feasibility Study
• Definition
• System Design
• Development
• Deployment
• Operation and Maintenance


Task Statement: A task is an activity performed within a performance domain. Each performance domain consists of a series of tasks that collectively form a comprehensive and detailed description of each performance domain. Typically, task statements answer such questions as:

• What activity did you perform?
• To whom or to what was your activity directed?
• Why did you perform that activity?
• How did you accomplish the activity?


Section C
Evaluation of Performance Domains

Instructions: You will be rating each performance domain identified by an expert panel on three dimensions: importance, criticality, and frequency. Please remember, the performance domains are the major responsibilities or duties that define the role of the Certified Automation Professional. Each performance domain may be considered a major heading in an outline. There are six performance domains included in this survey, as identified by an expert panel. Each performance domain consists of a series of tasks that collectively form a comprehensive and detailed description of each performance domain. A task is an activity performed within a performance domain. In this section, you will validate the performance domains. If you are unclear what areas a performance domain covers, please review Section D.

Importance: Importance is defined as the degree to which knowledge in the performance domain is essential to the role of the Certified Automation Professional. Indicate how important each performance domain is to the Certified Automation Professional. Rate each of the six performance domains by using the scale below. Please assign each performance domain only one rating. DO NOT RANK THE DOMAINS. Select the number of the description below that best exemplifies your rating for each performance domain, and write that number in the space provided next to each performance domain.

1 = Slightly Important. Performance of tasks in this domain is only slightly essential to the job performance of the Certified Automation Professional.
2 = Moderately Important. Performance of tasks in this domain is only moderately essential to the job performance of the Certified Automation Professional.
3 = Very Important. Performance of tasks in this domain is clearly essential to the job performance of the Certified Automation Professional.
4 = Extremely Important. Performance of tasks in this domain is absolutely essential to the job performance of the Certified Automation Professional.

Rating of Importance    Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance


Criticality: Criticality is defined as the potential for harmful consequences to occur if the Certified Automation Professional is not knowledgeable in the performance domain. Indicate the degree to which the inability of the Certified Automation Professional to perform tasks within the performance domain would be seen as causing harm to employers, employees, the public, and/or other relevant stakeholders. Harm may be physical, emotional, financial, etc. Rate each of the six performance domains by using the scale below. Please assign each performance domain only one rating. DO NOT RANK THE DOMAINS. Select the number of the description that best exemplifies your rating for each performance domain, and write that number in the space provided next to each performance domain.

1 = Minimal or No Harm. Inability to perform tasks within this performance domain would lead to error with minimal adverse consequences.
2 = Moderate Harm. Inability to perform tasks within this performance domain would lead to error with moderate adverse consequences.
3 = Substantial Harm. Inability to perform tasks within this performance domain would lead to error with substantial adverse consequences.
4 = Extreme Harm. Inability to perform tasks within this performance domain would definitely lead to error with severe adverse consequences.

Rating of Criticality    Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance

Frequency: What percent of time does the Certified Automation Professional spend performing duties associated with each domain? Write the percentage in the space provided next to each domain. The total must equal 100 percent.

Percent of Time    Performance Domain
1. Feasibility Study
2. Definition
3. System Design
4. Development
5. Deployment
6. Operation and Maintenance
Total: 100%


Section D
Evaluation of Task Statements

In this section you will rate the task statements associated with each of the six domains on three dimensions – importance, criticality, and frequency – according to the scales below. Please remember, a task is an activity performed within a performance domain. As previously discussed, the performance domains are the major responsibilities and duties that define the role of the Certified Automation Professional. In this section, you will validate the tasks. If you are unclear about the relationship between the performance domains and the tasks, please review Section C.

Rating Scales
Importance: 1 – Slightly Important; 2 – Moderately Important; 3 – Very Important; 4 – Extremely Important
Criticality*: 1 – Causing Minimal or No Harm; 2 – Causing Moderate Harm; 3 – Causing Substantial Harm; 4 – Causing Extreme Harm
Frequency: 1 – About Once Per Year or Never; 2 – About Once Per Month; 3 – About Once Per Week; 4 – About Once Per Day or More Often

*The amount of harm that could be caused by performing the task incompetently.

Circle the number corresponding to the Importance, Criticality, and Frequency rating for each task statement.

DOMAIN I: FEASIBILITY STUDY
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Define the preliminary scope through currently established work practices in order to meet the business need.
Task 2: Determine the degree of automation required through cost/benefit analysis in order to meet the business need.
Task 3: Develop a preliminary automation strategy that matches the degree of automation required by considering an array of options and selecting the most reasonable in order to prepare feasibility estimates.
Task 4: Conduct technical studies for the preliminary automation strategy by gathering data and conducting an appropriate analysis relative to requirements in order to define development needs and risks.
Task 5: Perform a justification analysis by generating a feasibility cost estimate and using an accepted financial model in order to determine project viability.
Task 6: Create a conceptual summary document by reporting preliminary decisions and assumptions in order to facilitate "go"/"no go" decision making.

Please list any tasks related to Domain I that you think may have been overlooked. _____________________________________________________________________________________ _____________________________________________________________________________________

DOMAIN II: DEFINITION
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Determine operational strategies through discussion with key stakeholders and using appropriate documentation in order to create and communicate design requirements.


DOMAIN II: DEFINITION (CONTINUED)

Task 2: Analyze alternative technical solutions by conducting detailed studies in order to define the final automation strategy.
Task 3: Establish detailed requirements and data including network architecture, communication concepts, safety concepts, standards, vendor preferences, instrument and equipment data sheets, reporting and information needs, and security architecture through established practices in order to form the basis of the design.
Task 4: Generate a project cost estimate by gathering cost information in order to determine continued project viability.
Task 5: Summarize project requirements by creating a basis-of-design document and a user-requirements document in order to launch the design phase.

Please list any tasks related to Domain II that you think may have been overlooked. _____________________________________________________________________________________ _____________________________________________________________________________________


DOMAIN III: SYSTEM DESIGN
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Perform safety and/or hazard analyses, security analyses, and regulatory compliance assessments by identifying key issues and risks in order to comply with applicable standards, policies, and regulations.
Task 2: Establish standards, templates, and guidelines as applied to the automation system using the information gathered in the definition stage and considering human-factor effects in order to satisfy customer design criteria and preferences.
Task 3: Create detailed equipment specifications and instrument data sheets based on vendor selection criteria, characteristics and conditions of the physical environment, regulations, and performance requirements in order to purchase equipment and support system design and development.
Task 4: Define the data structure layout and data flow model considering the volume and type of data involved in order to provide specifications for hardware selection and software development.
Task 5: Select the physical communication media, network architecture, and protocols based on data requirements in order to complete system design and support system development.
Task 6: Develop a functional description of the automation solution (e.g., control scheme, alarms, HMI, reports) using rules established in the definition stage in order to guide development and programming.
Task 7: Design the test plan using chosen methodologies in order to execute appropriate testing relative to functional requirements.



DOMAIN III: SYSTEM DESIGN (CONTINUED)

Task 8: Perform the detailed design for the project by converting the engineering and system design into purchase requisitions, drawings, panel designs, and installation details consistent with the specification and functional descriptions in order to provide detailed information for development and deployment.
Task 9: Prepare comprehensive construction work packages by organizing the detailed design information and documents in order to release project for construction.

Please list any tasks related to Domain III that you think may have been overlooked. _____________________________________________________________________________________ _____________________________________________________________________________________

DOMAIN IV: DEVELOPMENT
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Develop Human Machine Interface (HMI) in accordance with the design documents in order to meet the functional requirements.
Task 2: Develop database and reporting functions in accordance with the design documents in order to meet the functional requirements.
Task 3: Develop control configuration or programming in accordance with the design documents in order to meet the functional requirements.
Task 4: Implement data transfer methodology that maximizes throughput and ensures data integrity using communication protocols and specifications in order to assure efficiency and reliability.
Task 5: Implement security methodology in accordance with stakeholder requirements in order to mitigate loss and risk.
Task 6: Review configuration and programming using defined practices in order to establish compliance with all design requirements.
Task 7: Test the automation system using the test plan in order to determine compliance with functional requirements.
Task 8: Assemble all required documentation and user manuals created during the development process in order to transfer essential knowledge to customers and end users.

Please list any tasks related to Domain IV that you think may have been overlooked. _____________________________________________________________________________________ _____________________________________________________________________________________



DOMAIN V: DEPLOYMENT
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Perform receipt verification of all field devices by comparing vendor records against design specifications in order to ensure that devices are as specified.
Task 2: Perform physical inspection of installed equipment against construction drawings in order to ensure installation in accordance with design drawings and specifications.
Task 3: Install configuration and programs by loading them into the target devices in order to prepare for testing.
Task 4: Solve unforeseen problems identified during installation using troubleshooting skills in order to correct deficiencies.
Task 5: Test configuration and programming in accordance with the design documents by executing the test plan in order to verify that the system operates as specified.
Task 6: Test communication systems and field devices in accordance with design specifications in order to ensure proper operation.
Task 7: Test all safety elements and systems by executing test plans in order to ensure that safety functions operate as designed.
Task 8: Test all security features by executing test plans in order to ensure that security functions operate as designed.
Task 9: Provide initial training for facility personnel in system operation and maintenance through classroom and hands-on training in order to ensure proper use of the system.
Task 10: Execute system-level tests in accordance with the test plan in order to ensure the entire system functions as designed.
Task 11: Troubleshoot problems identified during testing using a structured methodology in order to correct system deficiencies.
Task 12: Make necessary adjustments using applicable tools and techniques in order to demonstrate system performance and turn the automated system over to operations.

Please list any tasks related to Domain V that you think may have been overlooked. _____________________________________________________________________________________ _____________________________________________________________________________________ _____________________________________________________________________________________



DOMAIN VI: OPERATION AND MAINTENANCE
IMPORTANCE 1 2 3 4    CRITICALITY 1 2 3 4    FREQUENCY 1 2 3 4

Task 1: Verify system performance and records periodically using established procedures in order to ensure compliance with standards, regulations, and best practices.
Task 2: Provide technical support for facility personnel by applying system expertise in order to maximize system availability.
Task 3: Perform training needs analysis periodically for facility personnel using skill assessments in order to establish objectives for the training program.
Task 4: Provide training for facility personnel by addressing identified objectives in order to ensure the skill level of personnel is adequate for the technology and products used in the system.
Task 5: Monitor performance using software and hardware diagnostic tools in order to support early detection of potential problems.
Task 6: Perform periodic inspections and tests in accordance with written standards and procedures in order to verify system or component performance against requirements.
Task 7: Perform continuous improvement by working with facility personnel in order to increase capacity, reliability, and/or efficiency.
Task 8: Document lessons learned by reviewing the project with all stakeholders in order to improve future projects.
Task 9: Maintain licenses, updates, and service contracts for software and equipment by reviewing both internal and external options in order to meet expectations for capability and availability.
Task 10: Determine the need for spare parts based on an assessment of installed base and probability of failure in order to maximize system availability and minimize cost.
Task 11: Provide a system management plan by performing preventive maintenance, implementing backups, and designing recovery plans in order to avoid and recover from system failures.
Task 12: Follow a process for authorization and implementation of changes in accordance with established standards or practices in order to safeguard system and documentation integrity.

Please list any tasks related to Domain VI that you think may have been overlooked. _____________________________________________________________________________________ ____________________________________________________________________________________ _____________________________________________________________________________________


ISBN - 1-55617-903-0
