
Developing a Service Quality Construct: A Pedagogical Approach

Festus Olorunniwo, Tennessee State University Byron Pennington, Tennessee State University Maxwell K. Hsu, Tennessee State University

ABSTRACT

This paper suggests an interesting pedagogical method that can be used to develop an operationalizable questionnaire for most service industries. Beyond its use in the traditional classroom, marketing managers can also employ the proposed approach as an analytical tool for identifying operational strengths and weaknesses and thus develop a better service program.

INTRODUCTION

Among the several studies that have attempted to measure the quality of service, the most prominent are those reported by Parasuraman, Zeithaml and Berry (PZB, 1988) and Cronin and Taylor (1992). PZB (1988) made the first systematic attempt at measuring service quality. The instrument they developed is commonly referred to as SERVQUAL. SERVQUAL was created as a means of tracking service quality across industries, determining the importance of key consumer perceptions, and categorizing consumers. The instrument uses five defining constructs to evaluate service quality. Its scale (consisting of 22 pairs of question items) is based on the concept that the difference between consumers' perceptions (P) and expectations (E) of the service drives the customer's judgment of the overall quality of the service. Perception (P) is defined as the customer's judgment of the service organization's performance (PZB 1985, 1988).

Since the conception of PZB (1985, 1988) and its subsequent modification (Zeithaml, Berry and Parasuraman, 1990), the SERVQUAL instrument for measuring service quality has been subjected to a number of criticisms. Two of the issues raised are as follows: (a) the question items in the SERVQUAL construct are global in nature, and it is consequently argued that the outcome of the survey is of little utility for instituting an operational improvement process; stated briefly, the SERVQUAL survey is not easily operationalizable; (b) the use of the Expectation-Perception gap as a measurement score for service quality has been challenged by several authors, who argue that perceptions alone can be sufficient as an evaluation of the global quality judgment.

This study addresses issues (a) and (b) for the Assisted Living Industry. Utilizing the perceptions-only (SERVPERF) approach, the objective of this study is to propose a methodology that can ensure that service quality measurement is operationalizable in any industry context.
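To make the difference between the two scoring rules concrete, the brief sketch below (our illustration, not part of PZB's or Cronin and Taylor's work; the item names and ratings are hypothetical) computes a SERVQUAL gap score (P - E) and a SERVPERF performance-only score for a single respondent on a 7-point scale.

```python
# Illustrative sketch (not from the studies cited above): scoring one
# respondent's answers under SERVQUAL (perception minus expectation)
# versus SERVPERF (perception only). Item names and ratings are hypothetical.

# Hypothetical 7-point Likert ratings for a handful of items.
expectations = {"prompt_service": 7, "clean_facility": 6, "courteous_staff": 7}
perceptions  = {"prompt_service": 5, "clean_facility": 6, "courteous_staff": 6}

def servqual_score(perc: dict, expe: dict) -> float:
    """Average of (P - E) gaps across items; a negative value means
    performance fell short of expectations."""
    gaps = [perc[item] - expe[item] for item in perc]
    return sum(gaps) / len(gaps)

def servperf_score(perc: dict) -> float:
    """Average of perception (performance) ratings only."""
    return sum(perc.values()) / len(perc)

print(f"SERVQUAL (P - E): {servqual_score(perceptions, expectations):+.2f}")
print(f"SERVPERF (P only): {servperf_score(perceptions):.2f}")
```

The sketch simply makes visible that SERVQUAL requires two ratings per item while SERVPERF requires one, which is part of the operational appeal of the performance-only approach discussed below.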

BACKGROUND

SERVQUAL or SERVPERF?

Several studies have challenged the P-E score as a measure of service quality (Cronin and Taylor, 1992; Quester et al., 1995; Teas, 1993a, 1993b). Cronin and Taylor (1992) suggest that there is no real evidence to support the concept of a performance-minus-expectations gap as a basis for measuring service quality. They argue that using performance scores alone (SERVPERF) gives a better measure of service quality. Quester et al. (1995) reinforce the arguments of Cronin and Taylor with respect to performance ratings and the performance gap measure, based on studies carried out in the Australian advertising industry. Teas (1993a, 1993b) also provides support for Cronin and Taylor's view on the theoretical and operational ambiguity of the expectations element of SERVQUAL. Gronroos (1988, 1990) enumerates some of the criticisms of using expectations as follows: (i) if expectations are measured after the service experience, or at the same time as the experience, then what is measured is not really expectations but something that has been biased by the experience; (ii) it does not necessarily make sense to measure expectations prior to the service experience either, because the expectations customers have beforehand may not be the expectations to which they compare their experiences; and (iii) measuring expectations may not be the correct thing to do since experiences are perceptions of reality, and inherent in these perceptions are the prior expectations.

Operationalization of SERVPERF

Recall that a criticism of SERVQUAL is that its question items are global in nature. Thus, the outcome of administering the SERVQUAL instrument to the customers of a service is of little utility for instituting an operational improvement process for that service. In this respect, Lapierre's (1996) study provided a more comprehensive approach to operationalizing a service quality construct in that it links the conceptual definition and empirical indicators of the construct. The premises that guided this approach are that (a) service quality research is critically dependent on the quality of the operational measures; (b) given the nature of service, the search for a universal conceptualization of service quality may be futile; and (c) the construct measurements are as important as the examination of substantive relationships.

Using Gronroos's (1988) service quality dimensions as the basis, Lapierre developed a 16-item instrument associated with the quality of a telecommunication network. The advantage of that approach is that Gronroos's proposed dimensions, untested prior to Lapierre's work, integrated several studies such as PZB's (1988) five functional criteria (tangibles, reliability, responsiveness, assurance, and empathy) and Lindqvist's (1987) criteria, tested in the context of passenger sea transportation services (behavior of staff, responsiveness, competence and courtesy, accessibility to customers, price levels, business hours, and purchase accessibility). Gronroos also considered Garvin's (1983, 1987) criteria on factors affecting the quality of manufactured goods (performance, features, reliability, conformance, durability, serviceability, aesthetics, and perceived quality). Arguing that Gronroos's (1988) list constitutes a synthesis of several studies, Lapierre built on this foundation and proposed six dimensions for the facilitating and supporting services (lumped together as the auxiliary service that includes the sales and repair functions in the telecommunications industry).

The six dimensions he used are: Reliability and Trust; Attitude and Behavior [closely related to PZB's (1988) Responsiveness]; Ability to Recover; Accessibility and Flexibility [closely related to PZB's Empathy]; Participation of Customers; and Knowledge and Skills [PZB's Assurance]. Lapierre (1996) stated that the items in each dimension were generated by the author through a review of the literature. Such an approach, however, may not necessarily generate items that can be operationalized. We agree with Lapierre (1996) that data for measuring service quality should be described in the language of those providing the data.

METHODOLOGY AND RESULTS

The assertion made in this paper is that service quality research is critically dependent on the quality of the operational measures. Consequently, the main objective of this research is to propose a methodology for generating operational measures of service quality. As indicated above, operationalizable data should be obtained through indicators described in the language of those providing the data; that is, the customers should be the source of the measurement items on a service quality construct. We propose a pedagogical methodology for soliciting the participation of focus groups of customers, or their surrogates, in developing the items that constitute the scale. The focus groups of surrogate customers comprised teams of students in a semester course on Management of Service Organizations offered in an AACSB-accredited college of business at a university located in a large metropolitan area of the USA. They are surrogate customers because most of them have relatives whom they have visited in Assisted Living communities. Henceforth, we shall refer to these teams of students as customers.

We define an Assisted Living Community as a residential institution that provides medical care, non-medical care, social activities, companionship, etc., to a special market segment. It could refer to a community planned and operated to provide a continuum of accommodations for seniors including, but not limited to, independent living, congregate housing, or a nursing facility. The term can therefore refer to a nursing facility: 24-hour nursing care for convalescent residents and those with long-term illnesses, where regular medical supervision and rehabilitation therapy are typically available. Also included are Independent Living Communities that provide a convenient, secure living arrangement for seniors, generally consisting of individual living units combined with an array of amenities that includes meals, housekeeping, laundry, social activities, and transportation.

The Pedagogical Approach

In order to operationalize the service quality measurement scale, we guided the customers through the following three-step process.

Step 1: Developing a Service Blueprint for a Typical Assisted Living Facility

Figure 1 (found at the end of this paper) shows the service blueprint for the activities in an Assisted Living Community. It was developed by one of the teams that worked on the Assisted Living Industry as a term-long project.

A service blueprint depicts the interactions of a typical customer as he or she navigates through the system. A unique feature of the service blueprint is the distinction made between the high customer-contact aspects of the service (i.e., the part of the process the customer sees) and those activities the customer does not see. This distinction is made through what is called the line of visibility on the chart. At each stage, failures that can possibly occur are identified and listed beneath the chart. Then poka-yokes are applied to develop preventions for the failures. Poka-yokes (roughly translated from the Japanese as "avoid mistakes") are procedures that block a mistake from becoming a service defect. These poka-yokes can be classified according to the three Ts: the task to be done (was the right transaction performed?), the treatment accorded to the customer (was the system customer friendly?), and the tangible or environmental features of the service facility (was the layout of the website simple and easy to use?). In a pedagogical context, the service blueprint allows the customers to understand the sequential stages of the service encounter. It also allows them to visualize possible failure incidents that affect customers' perception of the quality of the service delivery process. Finally, it sets the stage for the next step: developing the Walk-Thru-Audit (WTA).

Step 2: Developing the Walk-Thru-Audit

A Walk-Thru-Audit attempts to trace the experience of a customer, and his or her impression of the service quality, from the first to the last stage of a service encounter. This method was successfully utilized by Fitzsimmons and Maurer (1991) to evaluate customers' perceptions of their service experiences in full-service restaurants located in Texas. Notably, the customers developed their questions along the sequential order of the service encounter. For example, the sections deal with the external conditions, internal conditions, contracts and finances, and the actual services provided (medical, social, religious, etc.), in that order. To save space, the questionnaire items developed via the WTA approach are not reproduced here; they are available upon request.
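As a rough illustration of how the blueprint and its poka-yokes can feed the Walk-Thru-Audit, the sketch below (our construction; the stage names, failures, and poka-yokes are abbreviated from Figure 1, and the audit-question wording is hypothetical) encodes a few blueprint stages with their failure points and emits audit prompts in the chronological order of the encounter.

```python
# Illustrative sketch: a few blueprint stages (abbreviated from Figure 1)
# with possible failures and poka-yokes, used to generate walk-thru-audit
# prompts in the chronological order of the encounter. Question wording
# is hypothetical.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    visible_to_customer: bool            # above or below the line of visibility
    failures: list = field(default_factory=list)
    poka_yokes: list = field(default_factory=list)

blueprint = [
    Stage("Customer calls for an appointment", True,
          ["Customer forgets appointment"],
          ["Call one day before and ask whether transportation is needed"]),
    Stage("Key information into computer", False,
          ["Computer is down"],
          ["Keep written documents to enter later"]),
    Stage("Staff performs required work", True,
          ["Assigned staff member does not show"],
          ["Back-up person for every position"]),
]

# Audit prompts follow the sequence of the encounter and focus on the
# stages the customer actually sees.
visible_stages = [s for s in blueprint if s.visible_to_customer]
for i, stage in enumerate(visible_stages, start=1):
    print(f"{i}. During '{stage.name}', did any of the following occur: "
          f"{'; '.join(stage.failures)}?")
```

The point of the structure is simply that every failure point identified on the blueprint becomes a candidate audit question, which is what keeps the resulting instrument tied to the operational detail of the service.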
Step 3: Developing the SERVPERF Survey Instrument

Generally, question items in a WTA are framed and organized differently than those in a service quality measurement instrument (henceforth called SERVPERF). First, in a WTA some of the questions may solicit explicit responses [for example, Yes/No items] rather than opinions. As another example, in a WTA a question on how long a customer waits may present a set of ranges of minutes of wait from which the customer has to choose one; in SERVPERF, the question may ask the customer to express a level of agreement on whether the wait was too long. Second, while the questions in SERVPERF are grouped under different dimensions (i.e., reliability, empathy, assurance, etc.), those of the WTA follow the chronological sequence of service delivery (for example, from parking through dining and departure). The strength of going via a WTA to develop SERVPERF is that the WTA covers essentially all the specific quality issues that a customer may encounter; thus the ensuing SERVPERF will be operationalizable. Guided by the dimensions used by previous authors, together with the definitions of those dimensions, the customers were then asked to reframe, synthesize, combine, etc., the operational items implied in the WTA questions. Further, the customers, initially working in small teams of three, were asked to assign each item to whichever dimension they deemed appropriate for the Assisted Living Industry.

There were no restrictions imposed on either the choice or the number of dimensions that could be included by any team. Later, the small teams combined to form a large (jumbo) team to compare notes, deliberate, and reach a consensus. This approach is similar to that used by Llosa et al. (1998), although the latter used individual customers (not teams) and restricted their research to the SERVQUAL dimensions. Drawing on the different dimensions of service quality used in previous studies (PZB, 1988, 1993; Cronin and Taylor, 1992; Lapierre, 1996), the combined (jumbo) team reached a consensus on five dimensions (see Table 1) that they felt are most appropriate for measuring service quality in the Assisted Living Industry. As we noted earlier, Lapierre's (1996) dimensions include those of Gronroos (1988), which in turn were considered a synthesis of other studies in several service contexts. Thus the customers were provided with broad but rich information based on previous studies. Note that although the Accessibility and Flexibility dimension is embedded in the Empathy dimension in previous studies (PZB, 1988; Lapierre, 1996), the customers conjectured that the two should be separated in the Assisted Living setting. This approach of developing a service quality instrument via a WTA ensures that the question items in SERVPERF are operational in nature (responses to the questions can be used to reevaluate and improve the service delivery system, as suggested in Lapierre (1996)).

CONCLUSION

Business educators need to provide students with solid training in order to arm them with the confidence to face future challenges. Our teaching experience convinces us that students enjoy hands-on experience and tend to learn more, and faster, in a learning-by-doing environment. This paper suggests an interesting pedagogical method that can be used in undergraduate (or even graduate) service marketing/management courses. Students were given information about the background of a service industry (e.g., assisted living) and developed a walk-thru-audit analysis based on their research. With information from the WTA and past service quality research, students then designed the questionnaire. Accordingly, the ensuing SERVPERF will be operationalizable. However, one should subject the draft questionnaire to review by a number of marketing practitioners before it is used to elicit target customers' perceptions of service quality in the studied industry.

Focusing on the assisted living industry, as the aging of the U.S. population continues, healthcare organizations need to pay more attention to their service in order to satisfy the needs of their immediate customers and their family members. The present paper suggests that healthcare managers utilize the proposed WTA approach to identify their strengths and weaknesses and gain a competitive advantage by strategically implementing a superior service program. Future studies need to empirically verify the pre-identified dimensions and provide prescriptive guidance for the healthcare industry.
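One way such verification could begin is by checking the internal consistency of each pre-identified dimension. The sketch below is ours, not part of the study: Cronbach's alpha is a standard reliability check we introduce here for illustration, and the response matrix is simulated rather than real data.

```python
# Illustrative sketch: internal-consistency check for one pre-identified
# dimension using Cronbach's alpha. The response matrix is simulated;
# it is NOT data from the study.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: respondents x items matrix of Likert scores."""
    k = ratings.shape[1]                           # number of items
    item_vars = ratings.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = ratings.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
# 50 simulated respondents answering the four "Professionalism & Skills" items.
simulated = rng.integers(4, 8, size=(50, 4))
# With purely random data alpha will be near zero; coherent responses to a
# well-formed dimension should score much higher (0.7 is a common benchmark).
print(f"Cronbach's alpha: {cronbach_alpha(simulated):.2f}")
```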

Table 1: SERVPERF Instrument Developed by the Jumbo Team of Students

The following set of statements relates to your feelings about the facility you are evaluating. Circle a number from 1 to 7. A seven (7) means you strongly agree with the statement, a one (1) means you strongly disagree, and so on. There are no wrong answers; all we are interested in is the number that best shows your perception of each aspect of the quality of the facility you are evaluating for us.

(1) Strongly Disagree  (2) Mostly Disagree  (3) Partially Disagree  (4) Neutral  (5) Partially Agree  (6) Mostly Agree  (7) Strongly Agree

Professionalism & Skills
1. Staff members treat each other in a professional manner.
2. Staff is appropriately dressed.
3. Employees appear certified and skillful in performing their tasks.
4. Employees provide solutions appropriate to the problems that may occur.

Attitudes & Behavior
5. Staff appropriately greets visitors and competently answers questions.
6. Employees adapt to special needs.
7. Staff interacts well with residents in a caring and friendly way.
8a. Employees are always ready to help.
8b. Employees are courteous/respectful to residents.

Tangibles
9. There is proper lighting throughout the community.
10. Grounds are clean and free of trash.
11. The facility is fairly clean.
12. The residence's décor is home-like.
13. There are quality meals and dining service.
14. The visitors' waiting area has a home-like atmosphere.
15. Adequate guest accommodations are provided.
16. Informational signs are visible upon entering the community.
17. Proper and adequate safety measures are in place (fire/smoke alarms, emergency pull cords, etc.).

Accessibility & Flexibility
18. Physicians and nurses are on site regularly.
19. The accommodations are handicap accessible.
20. Adequate transportation is provided to off-site activities/appointments.
21. There are ample parking spaces.
22. The facility is sufficiently close to a medical facility in case of emergency.
23. Walkways from the parking lot to the building are clear and safe.
24. Staff is available to provide 24-hour assistance with activities of daily living (ADL) if needed.

Ability to Recover
25. Employees react quickly when problems occur.
26. Evacuation procedures are in place and practiced for unforeseen circumstances.
27. Backup equipment and supplies are available in case of emergency.
28. Alternate places/waiting lists are in place to prevent overcrowding.
29. Experienced social workers handle difficult situations well.
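To illustrate how responses to the instrument above could be made operationally useful, the short sketch below (our illustration; the ratings are hypothetical) averages item ratings within each of the five dimensions so that the weakest dimension stands out for managerial attention.

```python
# Illustrative sketch: rolling up hypothetical ratings on the Table 1 items
# into dimension-level SERVPERF scores so weak dimensions stand out.
dimension_items = {
    "Professionalism & Skills":    [1, 2, 3, 4],
    "Attitudes & Behavior":        [5, 6, 7, "8a", "8b"],
    "Tangibles":                   list(range(9, 18)),
    "Accessibility & Flexibility": list(range(18, 25)),
    "Ability to Recover":          list(range(25, 30)),
}

# Hypothetical 7-point ratings for a single respondent, keyed by item number.
ratings = {item: 6 for items in dimension_items.values() for item in items}
ratings.update({13: 3, 20: 4, 28: 2})   # a few deliberately weak items

for dimension, items in dimension_items.items():
    mean = sum(ratings[i] for i in items) / len(items)
    print(f"{dimension:30s} {mean:.2f}")
```

In practice the same roll-up would be averaged over all respondents, giving the kind of dimension-level diagnosis the paper argues an operationalizable instrument should support.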

REFERENCES

Cronin, J. Joseph and Steven A. Taylor (1992), "Measuring Service Quality: A Reexamination and Extension," Journal of Marketing, (July), 55-67.

Fitzsimmons, J. A. and Maurer, G. B. (1991), "A Walk-Through-Audit to Improve Restaurant Performance," The Cornell H.R.A. Quarterly, (February), 95-99.

Garvin, David A. (1983), "Quality on the Line," Harvard Business Review, 61 (September-October), 65-73.

Garvin, D. A. (1987), "Competing on the Eight Dimensions of Quality," Harvard Business Review, (November-December), 101-109.

Gronroos, C. (1988), "Service Quality: The Six Criteria of Good Perceived Service Quality," Review of Business [St. John's University], 9 (3, Winter), 10-13.

Gronroos, C. (1990), Service Management and Marketing, Lexington, MA: Lexington Books.

Lapierre, J. (1996), "Service Quality: The Construct, Its Dimensionality and Its Measurement," Advances in Services Marketing and Management, 45-69.

Lindqvist, L. J. (1987), "Quality and Service Value in the Consumption of Services," in Add Value to Your Service: The Key to Success, C. Surprenant, Ed., Chicago: American Marketing Association, 17-20.

Llosa, S., J. Chandon, and C. Orsingher (1998), "An Empirical Study of SERVQUAL's Dimensionality," The Service Industries Journal, 18(2), 16-44.

Parasuraman, A., V. A. Zeithaml, and L. L. Berry (1985), "A Conceptual Model of Service Quality and Its Implications for Future Research," Journal of Marketing, 49 (Fall), 41-50.

Parasuraman, A., V. A. Zeithaml, and L. L. Berry (1988), "SERVQUAL: A Multiple-Item Scale for Measuring Consumer Perceptions of Service Quality," Journal of Retailing, 64 (Spring), 12-40.

Parasuraman, A., L. L. Berry, and V. A. Zeithaml (1993), "Research Note: More on Improving Service Quality Measurement," Journal of Retailing, 69(1), 140-147.

Parasuraman, A., L. L. Berry, and V. A. Zeithaml (1994), "Reassessment of Expectations as a Comparison Standard in Measuring Service Quality: Implications for Future Research," Journal of Marketing, 58 (January), 111-124.

Quester, P., J. W. Wilkinson, and S. Romaniuk (1995), "A Test of Four Service Quality Measurement Scales: The Case of the Australian Advertising Industry," Working Paper, No. 39, Center de.

Teas, R. K. (1993a), "Expectations, Performance Evaluation and Consumers' Perceptions of Quality," Journal of Marketing, 57(4), 18-24.

Teas, R. K. (1993b), "Consumer Expectations and the Measurement of Perceived Service Quality," Journal of Professional Services Marketing, 8(2), 33-53.

Zeithaml, V. A., L. L. Berry, and A. Parasuraman (1990), Delivering Quality Service: Balancing Customer Perceptions and Expectations, New York: Free Press.

Figure 1: Service Blueprint for an Assisted Living Community

[Figure 1 is a flowchart that cannot be reproduced in text form. It connects the numbered steps described in the legend below (1-22) with Yes/No decision points (e.g., customer approval and the choice between Independent and Dependent care), and a Line of Visibility separates the activities the customer sees from back-office activities.]
Legend for Figure 1

1. Customer calls for an appointment. Failure: Customer forgets appointment. Poka-yoke: Call one day before the appointment and ask if transportation is needed.
2. Retirement center schedules appointment. Failure: Customer can't find the office. Poka-yoke: Clear and informative signs directing customers. Failure: Customer can't find the building (coming from home). Poka-yoke: Offer to pick the customer up, or deliver or mail directions.
3. Customer arrives for appointment.
4. Welcome the customer.
5. Interview applicant. Failure: Customer can't decide which living arrangement they want. Poka-yoke: Joint counseling (counselor repeats his or her understanding of the problem to the customer).
6. What kind of care does the customer want?
7. Tour independent residence. Failure: Customer unable to tour physically. Poka-yoke: Wheelchairs, electric carts, and videotapes to take home or watch on site.
8. Tour assisted residence.
9. Does the customer approve?
10. Key information into the computer. Failure: Computer is down. Poka-yoke: Written documents to support entering the information into the computer later.
*11. Finalize paperwork.
12. Customer checks in.
13. Help customers with moving in. Failure: Short on staff. Poka-yoke: Contract with other companies to help with the move; delegate other staff members, or administration will help.
14. Staff performs required work. Failure: Staff member not there (called out or did not show). Poka-yoke: Back-up for every position.
15. Give rules and regulations, schedule of events, etc.
16. Introduction to the staff.
17. Work finished. Failure: Customer not satisfied. Poka-yoke: Ask for customer input and evaluations.
18. Prepare error-free bill.
19. Give monthly bill. Failure: Financial problems. Poka-yoke: Insurance, government aid, and discounts.
20. Customer pays bill.
21. Thank them or recommend them elsewhere.
22. Services continue.
