
4th International Computer & Instructional Technologies Symposium. Selçuk University, Konya, TURKEY, 2010; ISBN: 978-605-61434-2-7. pp. 1035-1040

ANALYTIC HIERARCHY PROCESS APPROACH TO DECISIONS ON INSTRUCTIONAL SOFTWARE


Murat Paşa UYSAL, Ph.D., Turkish Military Academy, Defense Sciences Institute, mpuysal@kho.edu.tr

ABSTRACT: Developments in computer and information technologies continue to create opportunities for designing advanced instructional software, while also calling for objective and technical evaluation methodologies. In this respect, one of the hard decisions that educators have to make is selecting qualified instructional software. The functions of the courseware, or the type of instructional software, determine the underlying main evaluation criteria. In this study, these criteria are restricted to the instructional design, content, and technical features of instructional software for the sake of simplicity. A decision on instructional software and its analysis need systematic and structured guidance, along with the analytical tools that lead to better decisions. The Analytic Hierarchy Process (AHP) is one of the appropriate methods that meet these requirements. AHP enables individuals to structure complex problems as a hierarchy for evaluating quantitative and qualitative factors, and it addresses how to determine the relative importance of a set of alternatives in a multi-criteria decision-making environment. This study aims to show that AHP is an effective quantitative evaluation method for decisions on instructional software and a promising tool for instructional technology-related decisions.

Key words: Instructional software, analytic hierarchy process, decision

INTRODUCTION

Humans have to make decisions in order to select or act on something, for purposes ranging from simple to complex and from conscious to unconscious. Decisions are an inevitable part of our daily lives, and their effects are felt whether or not they are expected. At one time or another, everyone has wished that a difficult decision were easy to make and that there were a simple, straightforward way to follow.
For example, in purchasing a utility there are many factors to consider, such as price, flexibility, brand name, manufacturer support, etc. Clemen (1996) indicates that factors such as complexity, uncertainty, multiple objectives, and different perspectives in decision making constitute the basic sources of difficulty. In this type of multi-factor decision making, a person weighs the various factors intuitively or subjectively while feeling the need for a quantitative approach. Most decision problems involve a number of factors and require multi-factor evaluation processes. Therefore, a decision and its analysis need systematic and structured guidance, along with the analytical tools that lead to better decisions. Developments in computer and information technologies continue to create opportunities for designing advanced instructional software while also calling for objective and technical evaluation methodologies. One of the hard decisions that educators have to make is selecting qualified instructional software: effective computer-aided instruction is not possible with unqualified instructional software, as it wastes time and resources (Merrill, 1996). The functions of the courseware or the type of instructional software (drill, simulation, tutorial, problem solving, instructional game, etc.) determine the underlying main evaluation criteria. These criteria are restricted to instructional design, content, and technical features for the simplicity of this study. The main point is that the Analytic Hierarchy Process (AHP) is one of the appropriate methods for meeting the requirements of instructional technology-related decisions.
THE ANALYTIC HIERARCHY PROCESS

AHP is a systematic multi-criteria evaluation method developed by Saaty (1980) that has been applied to many different types of problems. AHP enables individuals to structure complex problems as a hierarchy for evaluating quantitative and qualitative factors, and it addresses how to determine the relative importance of a set of alternatives in a multi-criteria decision-making environment. It helps decision makers determine the various factors and their weights, which indicate their importance, and lay out the hierarchy of the decision. First, the decision maker defines the problem and sets up the goal related to it. Second, he or she determines the criteria, reflecting expert opinions. Third, the hierarchy is structured and reviewed. Fourth,

iteratively and respectively: a) pair-wise comparisons are made for each alternative, b) criteria weights are calculated, and c) consistency is checked. Fifth, the weights of the criteria are aggregated. Finally, the weights are combined to rank the alternatives (Figure 1).
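As a sketch, the procedure above can be expressed in a few lines of Python. The function and variable names are illustrative, not from the paper, and the reciprocal pair-wise comparison matrices are assumed to be supplied already:

```python
# Saaty's random indices by matrix size (n = 2..10).
RI = {2: 0.00, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
      7: 1.32, 8: 1.41, 9: 1.45, 10: 1.51}

def priority_vector(a):
    """Normalize each column by its total, then average the rows."""
    n = len(a)
    totals = [sum(a[i][j] for i in range(n)) for j in range(n)]
    return [sum(a[i][j] / totals[j] for j in range(n)) / n for i in range(n)]

def consistency_ratio(a, w):
    """CR = CI / RI, with CI = (lambda_max - n) / (n - 1)."""
    n = len(a)
    lam = sum(sum(a[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = RI[n]
    return 0.0 if ri == 0 else ci / ri

def ahp_rank(criteria_matrix, alternative_matrices):
    """Score alternatives from pair-wise matrices for the criteria and,
    per criterion, for the alternatives; reject inconsistent judgments."""
    cw = priority_vector(criteria_matrix)
    assert consistency_ratio(criteria_matrix, cw) < 0.1, "re-evaluate judgments"
    per_criterion = [priority_vector(a) for a in alternative_matrices]
    for a, w in zip(alternative_matrices, per_criterion):
        assert consistency_ratio(a, w) < 0.1, "re-evaluate judgments"
    n_alt = len(per_criterion[0])
    return [sum(cw[k] * per_criterion[k][i] for k in range(len(cw)))
            for i in range(n_alt)]
```

Each step of this pipeline is worked out by hand in the illustrative example below.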

[Figure 1, flow chart: Define Problem & Set Up Goal -> Determine Criteria -> Structure the Hierarchy -> Review -> Make Pair-Wise Comparisons -> Calculate Criteria Weights -> Check Consistency (CR): if 0 < CR < 0.1, proceed to Aggregate Weights of Criteria -> Rank the Weights -> Select the Alternative; otherwise, return to the pair-wise comparisons.]

[Figure 1, hierarchy: Goal (Decision on Instructional Software) -> Criteria (Criterion-1, Criterion-2, Criterion-3, ..., Criterion-n) -> Alternatives (Tutorial-1, Tutorial-2, Tutorial-3, ..., Tutorial-n).]

Figure 1. AHP Flow Chart and Hierarchy Layout of the Decision Structure

THE ILLUSTRATIVE EXAMPLE

Defining the Problem and Goal
To illustrate the AHP approach to decisions on instructional software, suppose that we have narrowed our software alternatives down from many to three, Tutorial-1, Tutorial-2 and Tutorial-3, for the simplicity of our study. The problem, or the goal of this study, is to select the most appropriate tutorial out of the three.

Determining the Criteria
It is possible to determine many criteria for the multi-factor evaluation process. However, these criteria are restricted to instructional design, content and technical features in order to simplify the calculation procedures.

Pair-Wise Comparisons
The key issue for AHP is making iterative pair-wise comparisons. We need to compare two alternatives (software) based on a selected criterion, using a scale that ranges from 1 to 9 (1: equally preferred; 2: equally to moderately preferred; 3: moderately preferred; 4: moderately to strongly preferred; 5: strongly preferred; 6: strongly to very strongly preferred; 7: very strongly preferred; 8: very to extremely strongly preferred; 9: extremely preferred). We begin by looking at the criterion instructional design, and then compare Tutorial-1 with Tutorial-2, Tutorial-1 with Tutorial-3, and finally Tutorial-2 with Tutorial-3 for scoring purposes. A pair-wise comparison matrix is constructed at the end of these procedures. This matrix reveals our preference for instructional design concerning the three tutorials. Assuming that we are experts on instructional technology, it is determined that Tutorial-1 is moderately preferred to Tutorial-2,


Tutorial-1 is extremely preferred to Tutorial-3, and Tutorial-2 is strongly to very strongly preferred to Tutorial-3 (Table 1). We place 1s along the diagonal, from the upper left corner to the lower right corner of the matrix, since each tutorial is equally preferred to itself. If Tutorial-1 is moderately preferred to Tutorial-2 and scored as 3, then Tutorial-2 is preferred to Tutorial-1 with the score 1/3. We can complete the lower left of the matrix using the same logic (Table 2).
Table 1. Initial Comparisons
Instructional Design   Tutorial-1   Tutorial-2   Tutorial-3
Tutorial-1                            3            9
Tutorial-2                                         6
Tutorial-3

Table 2. Completed Comparison Matrix


Instructional Design   Tutorial-1   Tutorial-2   Tutorial-3
Tutorial-1                 1            3            9
Tutorial-2                1/3           1            6
Tutorial-3                1/9          1/6           1
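The reciprocal structure of such a matrix (1s on the diagonal, reciprocals in the lower left) can be built programmatically. This is an illustrative sketch with hypothetical names, using the judgments 3, 9, and 6 from the text (0-indexed):

```python
def pairwise_matrix(n, judgments):
    """judgments maps (i, j), i < j, to a Saaty-scale score meaning
    'alternative i is preferred to alternative j by this much'."""
    a = [[1.0] * n for _ in range(n)]   # each alternative equals itself
    for (i, j), score in judgments.items():
        a[i][j] = float(score)          # stated preference (upper right)
        a[j][i] = 1.0 / score           # reciprocal (lower left)
    return a

# a(1,2) = 3, a(1,3) = 9, a(2,3) = 6 as in Tables 1 and 2.
A = pairwise_matrix(3, {(0, 1): 3, (0, 2): 9, (1, 2): 6})
```

Only the upper-triangle judgments need to be elicited; the rest of the matrix follows mechanically.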

Evaluations for the Criterion
Evaluation procedures start after the completion of the pair-wise comparisons. A decision maker executes the same evaluation steps iteratively so that the relative importance of each criterion is determined clearly. Although each criterion needs to be handled individually and its results combined with the next criterion's calculations, we will focus only on the criterion Instructional Design because of the limitations of this study, and leave the calculations for the other criteria to the reader. To make the calculations easier, we first convert the matrix entries to decimals and obtain the column totals (Table 3). Each entry is then divided by its respective column total, yielding a normalized matrix (Table 4).
Table 3. Matrix Converted to Decimals
Instructional Design   Tutorial-1   Tutorial-2   Tutorial-3
Tutorial-1               1.000        3.000        9.000
Tutorial-2               0.333        1.000        6.000
Tutorial-3               0.111        0.167        1.000
Column Totals            1.444        4.167       16.000

Table 4. Matrix Divided by Column Totals


Instructional Design   Tutorial-1   Tutorial-2   Tutorial-3
Tutorial-1               0.692        0.720        0.562
Tutorial-2               0.230        0.240        0.375
Tutorial-3               0.077        0.040        0.063

The priorities for Instructional Design of the three tutorials are determined by averaging the rows of the normalized matrix (Table 5). As seen in Table 5, the criterion evaluations for Tutorial-1, Tutorial-2, and Tutorial-3 are 0.658, 0.282, and 0.060 respectively, and they are transferred to the decision matrix (Table 6). Before that, however, we need to determine the Consistency Ratio, which ensures that the responses are consistent.
Table 5. Averages of the Rows
Instructional Design   Row Averages
Tutorial-1             (0.692 + 0.720 + 0.562) / 3 = 0.658
Tutorial-2             (0.230 + 0.240 + 0.375) / 3 = 0.282
Tutorial-3             (0.077 + 0.040 + 0.063) / 3 = 0.060
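The column-total and row-average steps of Tables 3 through 5 can be reproduced in a few lines. This is a minimal sketch with illustrative names, not the paper's own code:

```python
def priorities(a):
    """Divide each entry by its column total, then average each row
    (the Tables 3-5 procedure)."""
    n = len(a)
    col_totals = [sum(a[i][j] for i in range(n)) for j in range(n)]
    normalized = [[a[i][j] / col_totals[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Pair-wise comparison matrix for Instructional Design (Table 2).
A = [[1, 3, 9], [1/3, 1, 6], [1/9, 1/6, 1]]
w = priorities(A)
# w is approximately [0.658, 0.282, 0.060]
```

The resulting vector is the Instructional Design column of the decision matrix in Table 6.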

Table 6. Decision Matrix for Instructional Design


Criterion     Instructional Design   Content   Technical Features
Tutorial-1          0.658
Tutorial-2          0.282
Tutorial-3          0.060

Determining the Consistency Ratio for the Criterion
AHP regards consistency as cardinal consistency. For example, if A is thought to be two times more important than B, and B is considered three times more important than C, then A should be six times more important than C. If the decision maker judges that A is less important than C, a judgmental error occurs and the prioritization matrix is considered inconsistent. The Consistency Ratio (CR) is therefore a value indicating how consistent we are with our answers. A higher ratio means that the decision maker is less consistent, whereas a lower one means he or she is more consistent. In terms of numbers, if the ratio is 0.10 or less, the decision maker's answers are consistent. A consistency ratio higher than 0.10 requires re-evaluation of the responses given for the original matrix of pair-wise comparisons. In general, dividing the Consistency Index (CI) by the Random Index (RI) gives the CR. The basic formulas needed for calculating the CR are:

CR = CI / RI   and   CI = (λmax − n) / (n − 1)   (1)

where n is the number of alternatives, RI is the index number obtained from the table with an entry value of n, and λmax is obtained from the Weighted Sum Vector and the Consistency Vector as follows (the priorities are carried at full precision; the three-digit values shown are rounded):

Weighted Sum Vector:
(1)(0.658) + (3)(0.282) + (9)(0.060) ≈ 2.042
(1/3)(0.658) + (1)(0.282) + (6)(0.060) ≈ 0.860
(1/9)(0.658) + (1/6)(0.282) + (1)(0.060) ≈ 0.180   (2)

Consistency Vector:
2.042 / 0.658 = 3.103
0.860 / 0.282 = 3.051
0.180 / 0.060 = 3.009   (3)

λmax = (3.103 + 3.051 + 3.009) / 3 = 3.054   and   CI = (3.054 − 3) / (3 − 1) = 0.027   (4)

CR = CI / RI = 0.027 / 0.58 = 0.047, obtained from the value found by equation (4) and the value for n = 3 from the RI table (Table 7).
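The consistency check of equations (1) through (4) can be verified numerically. This is a sketch with illustrative names, assuming the comparison matrix and priorities derived above:

```python
# Saaty's random indices by matrix size (Table 7).
RI = {2: 0.00, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
      7: 1.32, 8: 1.41, 9: 1.45, 10: 1.51}

def consistency_ratio(a, w):
    n = len(a)
    weighted_sum = [sum(a[i][j] * w[j] for j in range(n)) for i in range(n)]
    consistency_vec = [ws / wi for ws, wi in zip(weighted_sum, w)]
    lam = sum(consistency_vec) / n           # lambda_max, equation (2)-(3)
    ci = (lam - n) / (n - 1)                 # Consistency Index, equation (4)
    return ci / RI[n]                        # Consistency Ratio, equation (1)

A = [[1, 3, 9], [1/3, 1, 6], [1/9, 1/6, 1]]
w = [0.658, 0.282, 0.060]
cr = consistency_ratio(A, w)
# cr is about 0.047, below the 0.10 threshold
```

A CR below 0.10 confirms that the pair-wise judgments may be accepted without re-evaluation.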

Table 7. Random Index Table


Number of Alternatives (n)   2      3      4      5      6      7      8      9      10
Random Index (RI)            0.00   0.58   0.90   1.12   1.24   1.32   1.41   1.45   1.51

Given the CR value of 0.047, it is possible to say that the decision maker is relatively consistent in his or her responses. As a result, the CR supports our original assessments in the pair-wise comparison matrix.

Evaluations for the Other Criteria
So far, we have determined the evaluations of all tutorial alternatives for the criterion instructional design. The same calculations can easily be made for the other criteria, named Content and Technical Features. Assuming that we have performed the same pair-wise calculations, we end up with the final comparison matrix (Table 8). The next step is determining the criteria weights. Rather than setting them subjectively, AHP again provides an objective method for finding the weights. Iterative calculation methods and the computation of each CR enable a decision maker to be sure that he or she is also consistent in the responses for the criteria weights. Table 9 shows the weights of the criteria, which are calculated in the same manner.
Table 8. Pair-wise Comparison Matrix
Criterion     Instructional Design   Content   Technical Features
Tutorial-1          0.658             0.087          0.497
Tutorial-2          0.282             0.182          0.398
Tutorial-3          0.060             0.750          0.107

Table 9. Weights of Criteria

Criterion   Instructional Design   Content   Technical Features
Weights           0.082             0.682          0.236

Table 10. The Final Decision

Alternatives        Tutorial-1   Tutorial-2   Tutorial-3
Evaluation Result     0.231        0.228        0.542
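The final scores in Table 10 follow from weighting each tutorial's criterion evaluations (Table 8) by the criteria weights (Table 9). This is a sketch using the printed values, so small rounding differences from Table 10 are possible:

```python
# Per-criterion evaluations of each tutorial, in the order
# (Instructional Design, Content, Technical Features), as in Table 8.
scores_by_criterion = {
    "Tutorial-1": [0.658, 0.087, 0.497],
    "Tutorial-2": [0.282, 0.182, 0.398],
    "Tutorial-3": [0.060, 0.750, 0.107],
}
criteria_weights = [0.082, 0.682, 0.236]   # Table 9

# Weighted sum of each row against the criteria weights (Table 10).
final = {name: sum(s * w for s, w in zip(scores, criteria_weights))
         for name, scores in scores_by_criterion.items()}
best = max(final, key=final.get)
# best is "Tutorial-3", with a score of about 0.542
```

The heavy weight on Content (0.682) is what lifts Tutorial-3, despite its weak instructional design score, to the top of the ranking.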

Selecting the Alternative
Following the completion of the comparison and criteria weight matrices, the last step is obtaining the final decision matrix. We find it by multiplying the comparison matrix by the criteria weight vector, as shown in Table 10. Tutorial-3 received the highest ranking, and it should be selected as the best instructional software.

CONCLUSION
The purpose of a decision analysis is to help someone make systematic decisions about complex problems, and it involves a number of different techniques. The selection of the best instructional software from a set of alternatives satisfying all the required criteria is a difficult task for a decision maker. This difficulty arises from the fact that different software products have different characteristics, and no single one possesses all the qualities satisfying the instructional and technical requirements. Therefore, any decision on instructional software and

its analysis needs systematic and structured guidance, as well as the analytical tools that enable better decisions. AHP is one of the methods that address how to determine the relative importance of alternatives in a multi-criteria decision-making environment. Its philosophy is based purely on comparing alternatives with respect to a criterion in a pair-wise and systematic manner. The primary characteristics of AHP, namely structuring complexity, quantitative and objective measurement, and synthesis of analyses, make it a widely applicable decision-making method. This study aimed to show that AHP is an effective quantitative evaluation method as well as a promising tool for instructional technology-related decisions.

REFERENCES

Alessi, S. M. & Trollip, S. R. (2001). Multimedia for Learning: Methods and Development. Massachusetts, USA: Pearson Education Company.

Chiu, C. M., Hsu, M. H., Sun, S. Y., Lin, T. C., & Sun, P. C. (2005). Usability, quality, value and e-learning continuance decisions. Computers and Education, 45(4), 399-416.

Clemen, R. T. (1996). Making Hard Decisions. Belmont, CA: Wadsworth Publishing Company.

Godse, M., Sonar, R., & Mulik, S. (2008). The Analytical Hierarchy Process approach for prioritizing features in the selection of web service. The Sixth European Conference on Web Services, Dublin, Ireland.

Lin, H. F. (2010). An application of fuzzy AHP for evaluating course website quality. Computers & Education, 54, 877-888.

Liaw, S. S. (2008). Investigating students' perceived satisfaction, behavioral intention, and effectiveness of e-learning: A case study of the Blackboard system. Computers and Education, 51(2), 864-873.

Merrill, M. D. (1996). Instructional Transaction Theory: An instructional design model based on knowledge objects. Retrieved from http://cito.byuh.edu/merril.

Nydick, R. L. & Hill, R. P. (1992). Using the Analytic Hierarchy Process to structure the supplier selection procedure. International Journal of Purchasing and Materials Management, 28(2), 31.

Saaty, T. L. (1980). The Analytic Hierarchy Process. New York, NY: McGraw-Hill.

Saaty, T. L. (1986). Axiomatic foundation of the Analytic Hierarchy Process. Management Science, 32, 841-855.

Saaty, T. L. (1991). How to make a decision: The Analytic Hierarchy Process. European Journal of Operations Research, 48, 9-26.

Shee, D. Y., & Wang, Y. (2008). Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Computers & Education, 50, 894-905.

Tahriri, F., Osman, M. R., Ali, A., Yusuf, R. M., & Esfandiary, A. (2008). AHP approach for supplier evaluation and selection in a steel manufacturing company. Journal of Industrial Engineering and Management, 1, 54-76.

Vaidya, O. S. & Kumar, S. (2006). Analytic hierarchy process: An overview of applications. European Journal of Operations Research, 169, 1-29.

Zhang, L., Wen, H., Li, D., Fu, Z., & Cui, S. (2009). E-learning adoption intention and its key influence factors based on innovation adoption theory. Mathematical and Computer Modelling.
