
Attribute Agreement Analysis

Overview | How to | Data | Example

Overview
The Attribute Agreement Analysis is used to assess the accuracy of subjective ratings made by appraisers. In general, subjective ratings are more likely to be accurate and useful when there is substantial agreement in measurements among the appraisers.
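As a minimal sketch of this idea (in Python, with made-up ratings rather than ProcessMA output), between-appraiser agreement can be read as the share of samples on which every appraiser gives the same rating:

```python
from collections import defaultdict

# Hypothetical ratings: (sample, appraiser) -> rating.
ratings = {
    (1, "A"): 1,  (1, "B"): 1,  (1, "C"): 1,   # all three agree
    (2, "A"): 0,  (2, "B"): 1,  (2, "C"): 0,   # B disagrees
    (3, "A"): -1, (3, "B"): -1, (3, "C"): -1,  # all three agree
}

# Group ratings by sample, then count samples where all appraisers match.
by_sample = defaultdict(list)
for (sample, _appraiser), rating in ratings.items():
    by_sample[sample].append(rating)

matched = sum(1 for r in by_sample.values() if len(set(r)) == 1)
print(f"Between-appraiser agreement: {matched}/{len(by_sample)} samples "
      f"({100 * matched / len(by_sample):.0f}%)")
```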

How to
1. Choose ProcessMA > Quality Tools > Attribute Agreement Analysis.
2. In Rating, select the column containing the measurement data.
3. In Samples, select the column containing the sample indicators.
4. In Appraisers, select the column containing the appraiser indicators.
5. Click OK.

Optional
6. In Known standard, select the column containing the known standard or master value for each sample.
7. Check Attribute data is ordered if your measurement data have more than two levels and are ordinal.
8. Check Show Kappa and Kendall coef if you want to display the kappa coefficient tables and Kendall's coefficient tables (a sketch of these coefficients follows the Note below).

Note
To select a column of data into a textbox, double-click any of the column names shown in the list on the left of the dialog box while the cursor is in the textbox.
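The kappa and Kendall tables themselves are not reproduced on this page, so the sketch below illustrates one standard pair of coefficients in Python: Cohen's kappa of each appraiser against the standard, and Kendall's tau for the ordered scale. The exact coefficients ProcessMA reports may be defined differently (for example, Fleiss-style kappa or Kendall's coefficient of concordance); the column names and data here are invented.

```python
import pandas as pd
from scipy.stats import kendalltau
from sklearn.metrics import cohen_kappa_score

# Illustrative data: two appraisers, five samples each, ordinal scale.
df = pd.DataFrame({
    "Appraiser": ["A"] * 5 + ["B"] * 5,
    "Rating":    [-1, 0, 1, 2, 2,   -1, 1, 1, 2, 0],
    "Standard":  [-1, 0, 1, 2, 1,   -1, 0, 1, 2, 1],
})

# Per-appraiser agreement with the known standard.
for name, grp in df.groupby("Appraiser"):
    kappa = cohen_kappa_score(grp["Rating"], grp["Standard"])
    tau, _p = kendalltau(grp["Rating"], grp["Standard"])
    print(f"{name}: kappa = {kappa:.2f}, Kendall tau = {tau:.2f}")
```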

Data
Rating: Text or Numeric.
Samples: Text or Numeric; must contain the same number of data points as Rating.
Appraisers: Text or Numeric; must contain the same number of data points as Rating.
Known standard: Text or Numeric.
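As a hedged sketch of this layout, the worksheet can be pictured as one row per appraisal, with equally long Rating, Samples, and Appraisers columns plus an optional Known standard. The frame below is illustrative only; the column names mirror the dialog fields and the values are invented.

```python
import pandas as pd

# One row per appraisal; pandas enforces that all columns have
# the same number of data points, mirroring the requirement above.
worksheet = pd.DataFrame({
    "Appraiser": ["Frances", "Frances", "Jane", "Jane"],
    "Sample":    [1, 2, 1, 2],
    "Rating":    [1, -1, 1, 0],
    "Standard":  [1, -1, 1, -1],  # optional known/master value
})
print(worksheet)
```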

Example
You work in a garment factory and have just trained 4 new quality controllers. The quality controllers need to determine if the garments are up to standard. You want to assess whether these new quality controllers are ready for the job. You asked each quality controller to rate 10 garments on a five-point scale (-2, -1, 0, 1, 2).
1. Open worksheet Gage.xls.
2. Choose ProcessMA > Quality Tools > Attribute Agreement Analysis.
3. In Rating, select C Rating.
4. In Samples, select B Garment.
5. In Appraisers, select A Controller.
6. In Known standard, select D Standard.
7. Check Attribute data is ordered.
8. Check Show Kappa and Kendall coef.
9. Click OK.

Attribute Agreement Analysis: Rating

Each Appraiser VS Standard


Assessment Agreement
Appraiser   # Inspected   # Matched   Percent   95% CI
Frances     10            9           90        (55.5, 99.75)
Jane        10            9           90        (55.5, 99.75)
John        10            6           60        (26.24, 87.84)
Mary        10            10          100       (74.11, 100)

Between Appraisers
Assessment Agreement
# Inspected   # Matched   Percent   95% CI
10            2           20        (2.521, 55.61)

All Appraisers VS Standard


Assessment Agreement
# Inspected   # Matched   Percent   95% CI
10            2           20        (2.521, 55.61)

Appraiser VS Standard

[Chart: % Matched for each appraiser (Frances, Jane, John, Mary), with 95% confidence intervals, on a percent scale.]

Interpretation

The Each Appraiser VS Standard assessment agreement table shows that John was able to match only 6 of the 10 assessments, while Mary matched all of them. The confidence interval of % Matched is shown in the table and also plotted as a chart. Based on this study, you conclude that John is in most need of further training.
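To sanity-check these figures outside ProcessMA, the sketch below recomputes the Each Appraiser VS Standard table in Python. ProcessMA's interval method is not documented on this page, but the printed intervals match the Clopper-Pearson exact 95% CI (with a one-sided lower bound at a perfect 10/10 match, which reproduces Mary's (74.11, 100)), so that is what the sketch assumes.

```python
from scipy.stats import beta

def exact_ci(matched, n, conf=0.95):
    """Clopper-Pearson exact CI for a binomial proportion.

    At the boundaries (0 or n matches) a one-sided bound is used,
    matching the intervals printed in the table above.
    """
    a = 1 - conf
    if matched == 0:
        return 0.0, 1 - a ** (1 / n)
    if matched == n:
        return a ** (1 / n), 1.0
    return (beta.ppf(a / 2, matched, n - matched + 1),
            beta.ppf(1 - a / 2, matched + 1, n - matched))

# Matched counts from the table above (10 garments per appraiser).
for name, matched in [("Frances", 9), ("Jane", 9), ("John", 6), ("Mary", 10)]:
    lo, hi = exact_ci(matched, 10)
    print(f"{name}: {100 * matched / 10:.0f}% "
          f"95% CI ({100 * lo:.2f}, {100 * hi:.2f})")
```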
