Job analysis is a process by which jobs are subdivided into elements, such as tasks, through the application of a formalized, systematic procedure for data collection, analysis, and synthesis (McCormick, 1976). At present, numerous methods are available for accomplishing job analysis (Biddle, 1977; Christal, 1974; Fine & Wiley, 1971; Fleishman, 1975; McCormick, Jeanneret, & Mecham, 1972; Primoff, 1975; Tornow & Pinto, 1976; U.S. Department of Labor, 1972). This plethora of methods may be considered either a fortunate state of affairs or a dilemma. The essence of the dilemma is whether any method of job analysis is superior to any other in its validity and usefulness (Ash & Levine, 1980; Levine, Ash, & Bennett, 1980; Prien, 1977).
Unfortunately, very few systematic, scientifically conducted studies of this problem are available. Much of the previous work may be characterized more as case studies and experience-based accounts than as carefully controlled, experimental studies (Levine et al., 1980). It will be quite some time before the necessary scientific evidence is accrued. Yet job analyses continue to be conducted, apparently at an increasing rate. These considerations led Ash and Levine (1980) to propose a mid-range strategy for obtaining the information needed to compare methods. The strategy consisted of systematically gathering the opinions of experienced job analysts about the quality and practicality of available job analysis methods. The information yielded by this approach would be supplemented, and eventually supplanted, as scientific evidence on the relative efficiency and effectiveness of job analysis methods grows.
The present authors selected seven job analysis methods for this study and had experienced job analysts evaluate them by means of a mailed questionnaire. The seven job analysis methods were the critical incident technique (CIT), functional job analysis (FJA), the job elements method (JEM), the Position Analysis Questionnaire (PAQ), task inventories combined with the Comprehensive Occupational Data Analysis Programs (TI/CODAP), threshold traits analysis (TTA), and the ability requirements scales (ARS).
¹ This research was funded under Law Enforcement Assistance Administration, Office of Criminal Justice Education and Training, Grant #79-DFAX-0195, Frank Sistrunk and Philip Smith, Principal Investigators. Opinions and conclusions are strictly those of the authors and do not necessarily reflect an official position of the Office of Criminal Justice Education and Training or the Law Enforcement Assistance Administration. The authors gratefully acknowledge the assistance of Susan Hensley and Jeffrey Jones, who participated in the development of the survey instrument.
Method
Subjects. Of those solicited through advertisements and personal contacts, 93 participants (a 61 percent response rate) completed and returned
questionnaires. Participants were required to have used at least two different job analysis methods and to have analyzed at least two different job
classes in their past work. Authors of job analysis methods included in this
survey were not invited or allowed to serve as participants.
The median age of the participants was 33. The median amount of job analysis experience was 6.3 years, and 86 percent of the participants had more than 2.5 years of experience; 82 percent were male. Participants were affiliated with colleges and universities (18 percent), state and local government agencies (18 percent), private business (17 percent), private consulting firms (13 percent), the federal government (10 percent), quasi-public firms (e.g., utilities) (6 percent), the military (3 percent), and miscellaneous (e.g., nonprofit research organizations) or more than one type of organization (14 percent).
Survey Instrument. The survey instrument was composed of two parts.
Part 1 contained descriptions of the seven job analysis methods. Each of
these was abstracted from published material, technical reports, and, in
some cases, personal communications with the developers of the method.
Attempts were made to keep length comparable and reading difficulty to
a minimum. The descriptions were edited to remove any evaluative statements. The outline used for all descriptions contained the following headings: name of technique; primary authors/developers; units of analysis; scales and rationale for their use; types of persons who may gather information; types of persons who may supply the data; persons who may analyze/interpret the information; data gathering instruments and techniques; form of final product; procedure; and time for completion. If information about any of these aspects was lacking in manuals or other sources,
it was left out of the description. The ordering of the descriptions was based
on previous research on familiarity (Levine, Bennett, & Ash, 1979). Descriptions of the methods presumed to be less familiar were placed at the
beginning of Part 1 in an attempt to counteract the effect of familiarity.
Part 2, the questionnaire, consisted of three sections. Section I covered
the effectiveness of the job analysis methods. In the first part of the section, respondents were to rate each of the methods on a 5-point scale of
how effective the method is for each of 11 organizational purposes (e.g.,
job description, job classification). A response of 1 indicated that the information yielded by a method was not at all comprehensive and was low in quality. A response of 5 indicated that the information was extremely comprehensive and extremely high in quality. The 11 purposes are listed in Table 1. The section also contained supplemental questions relating to whether the respondent preferred a combination of methods to one method alone when all organizational purposes were considered, whether a particular combination of methods was used in the respondent's job analysis work, and whether the respondent knew of a job analysis method as good as or better than any of the seven methods included in the survey.
Section II of the questionnaire required that the respondent rate the methods on 11 practicality scales (e.g., respondent/user acceptability; see Table 2 for a listing of the practicality variables). These practicality concerns were rated on a 5-point scale (1 = low practicality to 5 = high practicality); for example, the scale for the cost factor ranged from 1 = prohibitively high costs would be expected to 5 = negligible costs (job analyses would be relatively cheap).
Section III of the questionnaire covered demographic information and extent of familiarity with the seven job analysis methods (1 = not at all familiar to 4 = highly familiar).
Procedure. Potential participants were advised of the nature of the study
and of the criteria for participation. A pledge was made to keep all individual responses confidential. Survey participants reviewed the descriptions
of the methods, completed the ratings and other questions based on the
descriptions and their previous experience with the methods, and returned
the questionnaires in stamped, preaddressed envelopes.
Statistical Analyses. The primary analytical technique employed on the survey data was a two-way univariate ANOVA with repeated measures on the job analysis method factor. Organizational affiliation was the second independent variable. The job analysis method variable had seven levels, each corresponding to one method. The organizational affiliation variable had four levels: academic, public sector, private sector, and miscellaneous, the last consisting of those respondents who could not be classified in the other three groups. The dependent variables in these 22 primary analyses were the 11 effectiveness and 11 practicality ratings. If a significant main effect of job analysis methods was found for any dependent variable, Tukey tests were used to analyze pairwise mean differences between job analysis methods. Only results significant at p < .05, two-tailed, will be discussed.
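As an illustration only (no such code appears in the original), the following sketch shows how an analysis of this split-plot design might be set up with modern tools; the simulated ratings, the illustrative column names, and the use of the pingouin and statsmodels libraries are assumptions made for the example, not the authors' procedure:

    import numpy as np
    import pandas as pd
    import pingouin as pg
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Simulated stand-in for the survey data: 93 respondents, each rating
    # all seven methods on one 5-point effectiveness or practicality scale.
    rng = np.random.default_rng(0)
    methods = ["CIT", "FJA", "JEM", "PAQ", "TI/CODAP", "TTA", "ARS"]
    affiliations = ["academic", "public", "private", "miscellaneous"]
    rows = []
    for subject in range(93):
        affiliation = affiliations[subject % 4]  # illustrative assignment
        for method in methods:
            rows.append({"subject": subject, "affiliation": affiliation,
                         "method": method, "rating": rng.integers(1, 6)})
    df = pd.DataFrame(rows)

    # Two-way ANOVA with repeated measures on the method factor:
    # method is within-subjects, affiliation is between-subjects.
    anova_table = pg.mixed_anova(data=df, dv="rating", within="method",
                                 subject="subject", between="affiliation")
    print(anova_table)

    # Follow-up Tukey tests on pairwise method differences. This simple
    # call pools ratings across respondents and so ignores the
    # repeated-measures structure; it is shown only to indicate where the
    # pairwise comparisons fit in the workflow.
    print(pairwise_tukeyhsd(endog=df["rating"], groups=df["method"],
                            alpha=0.05))

In the study itself, one such analysis was run for each of the 22 effectiveness and practicality ratings.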
Results
[...]
use "off-the-shelf." All five of the other methods were rated about the
same. In addition, the 2-way analysis of variance showed a significant job
analysis method x organizational affiliation interaction effect for ratings
on this area of practicality, F(l8, 504)= l.89,p<.05. In this area of practicality, the private sector respondents tended to rate FJA and TI/CODAP
higher than did public sector respondents. Academic and miscellaneous sector respondents tended to rate the TTA higher than did other respondents,
and the ARS higher than public sector respondents.
Reliability. TI/CODAP and the PAQ were rated significantly higher in
reliability than were the other job analysis methods. Although FJA was
rated significantly lower in reliability than TI/CODAP and the PAQ, it
was rated significantly higher than the four remaining methods. The CIT
was rated significantly lower in reliability than all the job analysis methods
except the JEM.
Time to completion. PAQ, ARS, and TTA job analysis studies were rated
as taking less calendar time to complete than comparable job analysis studies
using other methods, although the ratings for the TTA were not significantly
different from those for the JEM. TI/CODAP studies were rated as requiring the most time for completion, although TI/CODAP ratings were
not significantly different from CIT ratings in this regard. A significant job analysis method × organizational affiliation interaction effect for time to completion ratings was also found, F(18, 516) = 2.34, p < .01. Academic sector respondents tended to rate the TTA and ARS as taking less time to complete than did private sector respondents. Miscellaneous sector respondents tended to rate the PAQ as taking less time, and the CIT as taking more time, to complete than did the public and private sector respondents. Private sector respondents tended to rate TI/CODAP as taking less time to complete than did the other respondents.
Additional Results. Combinations of job analysis methods generally were preferred over one method alone. Of the 93 respondents, 80 preferred a combination of methods, 9 preferred a single method, and 4 were not sure (χ² = 116.58, p < .001). The most frequently chosen combinations of methods included the following: (1) PAQ and TI/CODAP; (2) CIT and FJA; (3) CIT, TI/CODAP, and FJA; (4) PAQ and CIT; and (5) ARS and TI/CODAP. Job analysts who used combinations of methods generally reported that the advantages outweighed the added costs: of the 74 respondents who reported using combinations, 50 stated that the practice was worth the added costs, 7 stated that it was not, and 17 were not sure (χ² = 41.1, p < .001).
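As an arithmetic check (added here; the original does not show the computation), these values are consistent with chi-square goodness-of-fit tests that, we assume, compared the observed counts with equal expected frequencies across the three response options. For the first test the expected frequency is E = 93/3 = 31, so

    \chi^2 = \frac{(80-31)^2}{31} + \frac{(9-31)^2}{31} + \frac{(4-31)^2}{31}
           = \frac{2401 + 484 + 729}{31} \approx 116.58,

matching the reported value; the same computation with E = 74/3 ≈ 24.67 yields approximately 41.1 for the second test.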
Discussion
It is readily apparent that the results of an opinion survey such as this
are not a substitute for careful experimentation. To resolve definitively the
issue of relative efficacy among job analysis methods, a programmatic
series
[...]
data gathering approach used within a method (e.g., structured questionnaire, brainstorming meetings) is of secondary importance, although a significant factor nonetheless. Future research on these variables may lead to the synthesis of that "ideal" method, which is beyond reach at the present stage of knowledge. Perhaps that ideal method will overcome some of the limitations of current job analysis technology, namely, its relative inability to deal with the job as it should be, rather than as it is now, and with changes in the job over time.
Furthermore, the results provide some guidance to the developers of those
job analysis methods included in this study so that they may be modified
to serve user needs more adequately. In addition, the path to comparing combinations of methods has been cleared to some degree, which should facilitate future research on suitable combinations. No other studies are known to have addressed the efficacy of combinations of methods.
References
Ash, R. A., & Levine, E. L. A framework for evaluating job analysis methods. Personnel, 1980, 57(6),
53-59.
Biddle, R. E. Brief GOJA: A two hour job analysis. Sacramento, Cal.: Biddle and Associates, 1977.
Christal, R. E. The United States Air Force occupational research project. JSAS Catalog of Selected Documents in Psychology, 1974, 4, 61 (Ms. No. 651).
Fine, S., & Wiley, W. An introduction to functional job analysis: Methods/or manpower analysis.
Kalamazoo, Mich.: Upjohn Institute for Employment RCY.at'Ch, 1971 (Methods for Manpower
Anal- ysis #4).
Flanagan, J. C. The critical incident technique. Psychological Bulletin, 1954, 51, 327-358.
Fleishman, E. A. Toward a taxonomy of human performance. American Psychologist, 1975, 30, 1127-1149.
Levine, E. L., Ash, R. A., & Bennett, N. Exploratory comparative study of four job analysis methods. Journal of Applied Psychology, 1980, 65, 524-535.
Levine, E. L., Bennett, N., & Ash, R. A. Evaluation and use of four job analysis methods for personnel selection. Public Personnel Management, 1979, 8, 146-151.
McCormick, E. J. Job and task analysis. In M. D. Dunnette (Ed.), Handbook of industrial and organizational psychology. Chicago: Rand McNally, 1976.
McCormick, E. J., Jeanneret, P. R., & Mecham, R. C. A study of job characteristics and job dimensions as based on the Position Analysis Questionnaire (PAQ). Journal of Applied Psychology, 1972, 56, 347-368.
Prien, E. P. The function of job analysis in content validation. Personnel Psychology, 1977, 30, 167-174.
Primoff, E. S. How to prepare and conduct job element examinations. Washington, D.C.: U.S. Government Printing Office, 1975.
Tornow, W. W., & Pinto, P. R. The development of a managerial job taxonomy: A system for describing, classifying and evaluating executive positions. Journal of Applied Psychology, 1976, 61, 410-418.
U.S. Department of Labor. Handbook for analyzing jobs. Washington, D.C.: U.S. Government Printing Office, 1972.