
Information & Management 40 (2003) 757–768

Quality and effectiveness in Web-based customer support systems


Solomon Negash a,1, Terry Ryan b,*, Magid Igbaria b,2

a Argyros School of Business and Economics, Chapman University, Orange, CA 92866, USA
b School of Information Science, Claremont Graduate University, 130 E. Ninth Street, Claremont, CA 91711-6190, USA

Received 1 August 2001; received in revised form 1 March 2002; accepted 1 October 2002

Abstract

The quality of a Web-based customer support system involves the information it supplies, the service it provides, and characteristics of the system itself; its effectiveness is reflected by the satisfaction of its users. This paper presents the results of a study of quality and effectiveness in Web-based customer support systems. Data from a survey of 726 Internet users were used to test theoretically expected relationships. The results of this study indicate that information and system quality determine effectiveness while service quality has no impact. Practical implications for managers and designers are offered. © 2002 Elsevier Science B.V. All rights reserved.
Keywords: Web-based systems; Customer support; Information quality; System quality; Service quality; User satisfaction

* Corresponding author. Tel.: +1-909-607-9591; fax: +1-909-621-5651. E-mail addresses: negash@chapman.edu (S. Negash), terry.ryan@cgu.edu (T. Ryan).
1 Tel.: +1-714-628-2701; fax: +1-714-532-6081.
2 Deceased.

1. Background

Many firms have realized, as their marketplaces have become more global and service oriented, that customer support is critical to their competitiveness. A survey of 350 information systems (IS) executives revealed that connecting to customers and suppliers is one of their top 10 priorities. Among this same group, 60% of the respondents indicated that developing applications to support their customers was the most important focus for their system development efforts [68]. Another survey of 454 information technology (IT) managers indicated that 76% of the respondents were evaluated on their ability to use IT effectively to improve customer service [21]. Bill Gates of Microsoft said that he believes customer service is destined to become the primary value-added function in every business. He predicted that Web-based customer support was likely to have significant advantages when compared to customer support based on traditional media [26]. A 1999 study by the Yankee Group found that close to 80% of large companies were providing (or planned to provide) Web-based customer support access [35]. Another study by the Yankee Group found that 77% of their respondents handled 25% of their total customer support requests via the Web [79]. Today, Microsoft's Web-based customer support gets over 100,000 unique customer visits per day. By handling this volume of customer support online, Microsoft has been able to maintain a constant level of phone support during a period of sales growth. At Novell, Web-based customer support has reduced phone support by 45%. At Network Associates and Great Plains, Web-based customer support has reduced call volume by 37 and 20%, respectively [1].

Many firms are taking advantage of Web-based technologies to give customers direct access to their customer support knowledge base [13]. With the option of self-service, customers can choose to access support knowledge directly through the Internet. Web-based customer support systems can be used for internal and external customer support [46,75]. The Web environment allows consumers to recover from their mistakes [43] and to overcome some difficulties associated with traditional media; e.g. it may ameliorate problems of accessibility, bottlenecks, interaction, and identification. It also promises potential benefits for firms, including reduced transaction costs, reduced time to complete transactions, reduced clerical errors, faster responses to new market opportunities, improved monitoring of customer choices, improved market intelligence, more timely dissemination of information to stakeholders, and more highly customized advertising and promotion [6].

It is clear that Web-based customer support systems are important. What is not so clear is: what makes Web-based customer support systems effective? This paper identifies, and empirically establishes, the quality dimensions that underlie Web-based customer support system effectiveness.

Quality is a characteristic of a product or service that reflects how well it meets the needs of its consumers; as such, it is associated with product or service satisfaction [55]. Although consumers ultimately decide whether quality exists, organizations aspire to provide the features that their customers require [22]. The judgments that consumers make about product or service quality are based on what they feel they need. Organizations attempt to determine customer requirements and then to treat these results as guidelines for their products and services. For many firms, improved quality has become an essential ingredient for successful competition [23], while the standards by which their quality is judged have become global. In the IS realm, measurements of several dimensions of quality (information quality, system quality, and service quality) have been used to assess IS [80].

Quality has been described as "conformance to requirements", while satisfaction has been defined as "conformance to expectations". Under ideal circumstances, there would be no difference between consumer judgment of quality and experienced satisfaction, but circumstances are often less than ideal. Users may have different requirements, and meeting the requirements of all users is challenging. The situation is even more difficult in a Web-based customer support environment. The Web has a vast number of users, a non-homogeneous group, who are not confined by organizational context and for whom use of the Web is optional. Firms that wish to build successful Web systems must attract users to visit and revisit their sites voluntarily. And potential users can be very different. Their needs are difficult to define, much less to meet. However, providing quality, that is, meeting users' expectations, is critical for the success of Web-based customer support systems [11].

Studying the impacts of customer support, not just through the Web, has been a major focus in the marketing literature [7,9,51–53,82]. Customer support includes a technical aspect (i.e. were things done correctly?) and a relationship aspect (i.e. was the customer treated properly?) [8]. Customer support may be used to avoid adversity and build long-term relationships [54]. The technical component of customer support is addressed by both information and system quality, the relationship aspect by service quality.

2. Research model

DeLone and McLean formulated an IS success model using information and system quality to determine the effectiveness of an IS [15]. Pitt et al. adapted the DeLone and McLean model to customer service settings by adding a service quality component [62]. This paper focuses on the three quality dimensions of the Pitt et al. model, as well as user satisfaction. User satisfaction is used as a measure of Web-based customer support system effectiveness. The research model is presented in Fig. 1.

2.1. Information quality

Information quality is a function of the value of the output produced by a system as perceived by the user.


Fig. 1. The impact of quality on Web-based customer support system effectiveness.

Many different information characteristics have been viewed as important determinants of information quality perception, including: accuracy, precision, currency, output timeliness, reliability, completeness, conciseness, format, and relevance [2]; understandability [70]; report usefulness [49]; and sufficiency, freedom from bias, comparability, and quantitativeness. In this study, information quality was viewed as having two aspects: informativeness and entertainment. These two influence the value customers receive in a Web-based interface [18]. Informativeness includes information accuracy, relevance, timeliness, convenience, and completeness. Entertainment involves whether the interface is entertaining, enjoyable, pleasing, fun, and exciting. Here, informativeness and entertainment are defined as follows.

• Informativeness: the ability to inform customers about the product alternatives. This is considered along with confirmation of whether the information received is relevant, timely, up-to-date, and from an accurate source.
• Entertainment: the ability of the medium to fulfill audience needs for escapism, diversion, aesthetic enjoyment, or emotional release. The entertainment features, specifically whether the Web-based system is entertaining, fun, and enjoyable, were closely examined.

These definitions are consonant with a number of variables related to information quality employed in the IS and marketing literatures, including information accuracy and timeliness [44], information availability and site friendliness [78], and playfulness [45].

In these terms, better information quality has been thought to lead to better customer support. Thus we have the following hypothesis.

Hypothesis 1. Information quality is positively associated with Web-based customer support system effectiveness.

2.2. System quality

System quality is a measure of the information processing system itself. Here, we use interactivity and access to stand for system quality. The determining criteria in assessment of system quality are the performance characteristics of the systems under study. These include resource utilization [41]; reliability, response time, and ease of terminal use [72]; and data accuracy, reliability, completeness, system flexibility, and ease of use [32]. When people use an IS, their experience, which is with all of the system's features, not just the information it provides, affects their attitudes [40]. Users' perceptions of the features of a Web-based customer support system should affect their perceptions of the support they receive. Quality of system design, which encompasses system features, has been identified as critical for the success of a Web site design.

Communications between a firm and its customers, other than face-to-face discussions, take place through one or more media, via interactions with the media by both parties. The features of a Web-based interface make it an attractive choice as a medium for interaction between the firm and its customers [33].


Interactivity is the extent to which users can participate in modifying the form and content of a media-based environment in real time [71]. Here, interactivity and access are defined as follows.

• Interactivity: the extent to which users can participate in modifying the form and content of a media-based environment. Judging interactivity involves determining whether the Web-based system has quick feedback, multiple alternatives, and predictable screen changes.
• Access: the availability of the system when customers try to retrieve information, along with the ease of using the interface to contact people needed for support. Judging the accessibility of a Web-based customer support system involves determining whether the system is quick to respond, makes it easy to retrieve information, and makes it easy to contact management.

We expected that increases in system quality would increase the effectiveness of a Web-based customer support system, leading to Hypothesis 2.

Hypothesis 2. System quality is positively associated with Web-based customer support system effectiveness.

2.3. Service quality

Information and system quality focus on the system and its outputs as products. However, service quality is a different issue. Service quality problems are fundamentally different. Many services are consumed even as they are being produced. This makes it difficult to control and deliver consistent service. Parasuraman et al. [59] define service quality in terms of the difference between expected and perceived service. They state: "the key to ensuring good service quality is meeting or exceeding what (customers) expect from the service" (p. 46). Parasuraman et al. [60] developed an instrument, SERVQUAL, for measuring service quality. The five factors included in SERVQUAL are: tangibles, reliability, responsiveness, assurance, and empathy, defined as follows.

• Tangibles: refers to the physical features of the system, such as whether the system is appealing and looks good.
• Reliability: involves the system's consistency of performance and dependability, focusing on whether the system is right, useful, and dependable.
• Responsiveness: concerns the apparent readiness of the system to provide service, paying special attention to whether the system is prompt and tells when to expect service.
• Assurance: refers to the knowledge and courtesy expressed in the system and its ability to inspire trust and confidence in its safety.
• Empathy: involves the care and individualized attention the system gives its customers, including whether the interface medium gives individual attention, is a good means to communicate, and has convenient hours.

Quality of service is thought by some to be the core criterion for overall customer service [58]. Concern has been expressed that SERVQUAL may suffer from conceptual and empirical flaws [77], although these concerns may be disputed [63] or viewed as an opportunity for instrument improvement rather than fatal flaws [37]. In a Web-based customer support system, service quality acts as an enabler of, or impediment to, the customer support experience. It is therefore hypothesized that better service quality will positively influence the effectiveness of a Web-based customer support system.

Hypothesis 3. Service quality is positively associated with Web-based customer support system effectiveness.

2.4. Effectiveness

IS success is notoriously difficult to assess, but there have been many efforts by the IS research community to determine how to do so. Included in these efforts have been instruments created to assess perceived usefulness of MIS [24], user information satisfaction [5], perceived usefulness and ease of use [14], end-user computing satisfaction [17], and task–technology fit [30].


Based on a review of the IS literature, Yoon et al. [81] proposed five categories of system success: business profitability, improved decision quality and performance, perceived benefit of systems, level of system usage, and user satisfaction. DeLone and McLean postulated six aspects of IS success: system quality, information quality, system use, individual impact, organizational impact, and user satisfaction. In the customer satisfaction literature, satisfaction is a distinct concept, different from (perhaps, in part, determined by) service quality [57].

Seddon et al. noted that it is meaningful to consider IS success only from a particular perspective and in terms of a specific evaluation target. They said that five points of view may be adopted in assessing IS success: the independent observer, the individual, the group, the managers or owners, and the country. The target of evaluation can be one of six elements: a feature of an IT application, a particular IT application, a kind of IT application, a set of IT applications used by an organization, a methodology for developing IT applications, or the organizational function responsible for IT applications. Here, the effectiveness of a particular type of IT (the Web-based customer support system) is the target for evaluation, while the assessment to be made is from the perspective of the individual user; thus user satisfaction is the most appropriate surrogate measure. Other researchers have also used user satisfaction as a surrogate for effectiveness [19,25,50].

Keeping customers satisfied is the basis of business success. Reich and Benbasat [67] studied factors influencing the success of customer-oriented strategic systems implementations. They measured implementation success in terms of timely development, high adoption by customers, and involvement in the competitive position of the company. Although the target system that they studied is similar to the one investigated in our research, the viewpoint (i.e. that of the firm) was different. These kinds of outcome measures, while appropriate in their study, were less appropriate for us. Yoon et al. studied the determinants of expert systems success. Their research looked at the success of a type of IT application from the perspective of the individual. They chose to employ a measure of user satisfaction to indicate IS success. The measure they used was one they derived from an instrument [66] descended from a widely used earlier measure [34].

A number of other constructs exist that are different from user satisfaction, but still concern users' responses to their IS: system usage [4], user involvement [3], and user acceptance.

Some researchers prefer multiple measures of IS success, depending on the task, but there is much support for the use of user satisfaction alone. It can be viewed as a highly useful substitute for objective measures of system success [31] that has been employed by enough researchers to make it a basis for inter-study comparisons. We used user satisfaction to indicate the effectiveness of a Web-based customer support system, determined by the information quality, system quality, and service quality of the system. Higher levels of user satisfaction are assumed to correspond to higher levels of Web-based customer support system effectiveness.

The survey items used to measure satisfaction employ disconfirmation scales (i.e. scales couched in terms of the evaluation target being "better than expected" or "worse than expected", rather than in terms of the target being "poor" or "excellent" or in terms of the respondent being "very satisfied" or "very dissatisfied"). Such scaling is preferred for measurement of customer satisfaction in terms of discriminant and convergent validity, as well as lessened asymmetry of responses [12]. Response asymmetry has been a problem with nearly all satisfaction measures; they produce negatively skewed distributions where the majority of responses indicate a satisfied customer [61].
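
To make this scaling concrete, the short sketch below is an illustration only (it is not from the original study): the item labels follow Table 2, but the responses and the scoring choice are hypothetical. It shows how two disconfirmation-scaled items might be averaged into a composite satisfaction score and how the negative skew discussed above could be checked.

```python
import numpy as np
from scipy.stats import skew

# Hypothetical responses to the two disconfirmation-scaled satisfaction items
# (1 = "much worse than expected" ... 5 = "much better than expected").
# Columns correspond to the SATISF01 and SATISF03 labels used in Table 2.
responses = np.array([
    [4, 5],
    [5, 5],
    [3, 4],
    [4, 4],
    [2, 3],
    [5, 4],
])

# Composite satisfaction score: the mean of the two items for each respondent.
satisfaction = responses.mean(axis=1)

# A negative skewness value reflects the asymmetry noted above:
# most respondents cluster at the "satisfied" end of the scale.
print("composite scores:", satisfaction)
print("skewness:", skew(satisfaction))
```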

3. Methodology

Our research employed a Web-based survey; this was thought to be an appropriate way to gather data from users of Web-based customer support systems. Web-based surveys have been used in prior research [73,74]. Tan and Teo list several characteristics that make a Web-based survey different from mail surveys, including sampling frame, quality of data, response rate, and length of time needed. The target audience for our survey included students taking Web-related courses in business, management, IS, or computer science.

The fundamental question to be asked before measuring user satisfaction is: whose satisfaction should be measured and with what object? [69].


satisfaction on implementation success [27,28]; sales representative satisfaction with a new computer system [47]; executive satisfaction with an IS which aided decisions [48]; manager satisfaction on project success [64]; manager satisfaction as imputation of IS value [39]; and user satisfaction for the effectiveness of group decision support system [16]. Because our objective was to examine how satised users of a Web-based customer support system were with the particular application they used, only people who had used a Web-based customer support system participated. Satisfaction evaluation, unlike service quality perception assessment, requires experience with the service. Our Web-based survey consisted of items using a 5point Likert-type scale having values ranging from (1) strongly disagree to (5) strongly agree. Two pilot studies were conducted with Web-based customer support users from Fortune 2000 companies. Based on feedback from these, the format and readability of the surveys questions were modied. Two professors validated the appropriateness of the instrument for the student population. 3.1. Data collection A list of e-mail address for 54 instructors from 35 universities was compiled from a Web site (http:// dir.yahoo.com/eucation/hgher_eucation/clleges_and_uiversities/) and from publications of various academic IS organizations. The criterion for selecting instructors was that they were scheduled to teach a class in IS, computer science, or business during January and February 2001. Thirty-one instructors representing 22 universities agreed to participate. The instructors agreed to give class credit to their students as an incentive to participate in the survey. The link to the online survey (URL) was sent to the instructors; they, in turn, distributed the URL to their students. We received 886 responses. A large number of cases (120) were removed due to missing responses. These large numbers of missed responses seem to be random in nature, perhaps due to participants clicking outside the radio button range. In future, this oversight could be avoided by including program code that would remind participants about the missed responses before submitting their answers. In the end, 726 (82%) usable responses resulted.
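
The completeness check suggested above could look something like the sketch that follows. This is only an illustration under stated assumptions: the item labels are taken from Table 2, the function name is made up, and in a real deployment the reminder would more naturally run in the browser before the form is submitted.

```python
# Flag unanswered (or out-of-range) survey items so the respondent can be
# reminded before the response is accepted, avoiding the missing-data loss
# described above. Item labels follow Table 2; everything else is illustrative.

REQUIRED_ITEMS = ["IQINFO01", "IQINFO03", "IQINFO04", "SATISF01", "SATISF03"]

def missing_items(submission: dict) -> list:
    """Return labels of required items that were skipped or not in the 1-5 range."""
    problems = []
    for item in REQUIRED_ITEMS:
        value = submission.get(item)
        if value is None or not (1 <= int(value) <= 5):
            problems.append(item)
    return problems

answers = {"IQINFO01": 4, "IQINFO03": 5, "SATISF01": 3}  # two items left blank
unanswered = missing_items(answers)
if unanswered:
    print("Please answer the remaining items before submitting:", unanswered)
```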

3.2. Demographic data

Demographic statistics for the respondents are shown in Table 1. The gender ratio was 45% female and 55% male, with the majority of the students (75%) under 25 years of age, as might be expected from a college population. Graduate students made up 8% of the sample, with the balance undergraduate. Frequency of Web-based customer support usage (i.e. the number of times a respondent used his or her Web-based customer support system), e-commerce experience, and computing experience were also gathered.

Table 1
Demographic data for respondents

                                     Count    Percentage
Number of respondents                726      100
Gender
  Female                             324      45
  Male                               401      55
  Not selected                       1
Age
  Under 25                           547      75
  Over 25                            177      25
  Not selected                       2
Education level
  Undergraduate                      669      92
  Graduate                           55       8
  Not selected                       2
Major
  Information systems                303      42
  Computer science                   11       2
  Business                           177      24
  Accounting                         96       13
  MBA                                22       3
  Other                              61       8
  Not selected                       56       8
Computing experience
  Under 3 years                      237      33
  3–5 years                          174      24
  Over 5 years                       240      33
  Not selected                       75       10
E-commerce experience
  Under 3 months                     160      22
  Over 3 months                      389      54
  Not selected                       177
Web-based support usage frequency
  Less than once a month             226      31
  A few times a month                232      32
  A few times a week                 137      19
  Several times a day                127      17
  Not selected                       4        1
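
For readers who want to reproduce this kind of summary, a minimal sketch follows; the data frame here is a tiny hypothetical stand-in for the 726 survey records and the column names are invented. It produces count and percentage columns in the style of Table 1, reporting blanks as "Not selected" rather than dropping them.

```python
import pandas as pd

# Tiny hypothetical stand-in for the demographic records.
df = pd.DataFrame({
    "gender": ["Female", "Male", "Male", "Female", None],
    "age": ["Under 25", "Under 25", "Over 25", "Under 25", "Under 25"],
})

# One frequency block per demographic variable, in the style of Table 1.
for column in df.columns:
    counts = df[column].fillna("Not selected").value_counts()
    summary = pd.DataFrame({
        "Count": counts,
        "Percentage": (100 * counts / len(df)).round().astype(int),
    })
    print(f"\n{column}")
    print(summary)
```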

4. Data analysis

The factors and the corresponding items are shown in Table 2. All items, except those for satisfaction and interactivity, were from previously validated instruments. Satisfaction items were modifications of existing items to employ a disconfirmation scaling approach. Items for interactivity were derived from the literature. All items employed were validated during the pilot tests. As part of instrument testing during the pilot studies, it was determined that one of the three items in the scale for satisfaction did not correlate with the others, probably due to its negative wording. In the interest of higher reliability for the scale, this item was dropped.3 The wording for all items was modified for a Web-based customer support environment.

4.1. Reliability

The theoretical model is a structural equation model (SEM). The model was tested for fit with EQS for Windows (version 5.5), a covariance-based SEM tool. Path analysis was conducted for each of the hypothesized relationships. The reliability estimates for all but two factors were above the recommended 0.70 level [56]; these two factors, interactivity and empathy, had reliability coefficients of 0.69 and 0.68. The complete results of the reliability analyses are provided in Table 3.

4.2. Hypotheses results

Fig. 2 shows the research model, including the estimated path coefficients. As hypothesized, satisfaction was associated with information quality and system quality, which together explained 51% of the dependent construct's variance. Both paths had positive effects, with path coefficients of 0.348 and 0.449, respectively. Hypotheses 1 and 2 are supported, at the 0.05 and the 0.01 level, respectively. Hypothesis 3 was not supported; service quality had no significant effect on satisfaction.

3 Although scales having more items often display higher reliability, it is not necessary to have more than two scores to determine reliability [76]. The correlation coefficient is close, conceptually, to the reliability coefficient. The square of either can be interpreted as a coefficient of determination [36].
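
The computations behind Table 3 and Fig. 2 can be sketched as follows. This is not the analysis the authors ran (they fitted a covariance-based SEM in EQS); it is an illustrative approximation that computes Cronbach's alpha for one scale and then regresses a satisfaction composite on standardized quality composites, using invented data and invented variable names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of the summed scale)."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

# Hypothetical 1-5 responses to the three informativeness items from Table 2.
informativeness = pd.DataFrame({
    "IQINFO01": [4, 5, 3, 4, 2, 5, 4, 3],
    "IQINFO03": [4, 4, 3, 5, 2, 5, 4, 2],
    "IQINFO04": [5, 4, 2, 4, 3, 5, 5, 3],
})
print("informativeness alpha:", round(cronbach_alpha(informativeness), 3))

# Crude stand-in for the path analysis: regress a satisfaction composite on
# standardized quality composites. With standardized variables, the OLS
# coefficients play a role loosely analogous to the path weights in Fig. 2,
# and R-squared corresponds to the explained-variance figure reported above.
rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.integers(1, 6, size=(100, 4)).astype(float),
    columns=["info_quality", "sys_quality", "serv_quality", "satisfaction"],
)  # random placeholder data, so the printed numbers are meaningless
z = (data - data.mean()) / data.std(ddof=1)
fit = sm.OLS(z["satisfaction"], sm.add_constant(z[["info_quality", "sys_quality", "serv_quality"]])).fit()
print(fit.params)
print("R-squared:", fit.rsquared)
```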

5. Discussion and conclusions

This study tested a model concerning how various dimensions of quality affect Web-based customer support system effectiveness.

5.1. Methodological issues

Two general areas of concern are always present in studies that employ ratings of IS characteristics by individuals. One is whether the instruments were valid measures of the constructs. The other is whether participants' responses were truly targeted at the objects of interest. Here, one could ask whether an appropriate measure of satisfaction was employed. Additionally, one could question whether the Web-based customer support systems were well defined. It is reasonable to ask whether Web-based customer support systems are sufficiently different from other IS to be differentiated.

This study focused on Web-based customer support systems because firms and IS managers have an interest in using IT to support customer service and have recently begun to apply the technology specifically to this problem. Prior research examined satisfaction with systems that were, for the most part, internal to their host firms and oriented toward particular functions of the firm. A Web-based customer support system is open to an unlimited number of customers, who may need support before, during, and after their transactions. Opening up a system to the whole world, making it available to an unlimited number of users, is sure to pose novel challenges in assuring user satisfaction. Additionally, Web-based systems would present differences in terms of the features they may present, the information they could provide, the ways that they can be used, and the aspects of interaction with non-system components of the firm that they may replace, enhance, or automate.

Table 2
List of factors and items (constructs, factors, and items, with item labels in parentheses)

Information quality
  Informativeness
    Accurate source of information (IQINFO01)
    Provides timely information (IQINFO03)
    Has up-to-date information (IQINFO04)
  Entertainment
    Entertaining (IQENTE01)
    Enjoyable (IQENTE02)
    Fun to use (IQENTE04)
System quality
  Interactivity
    Provides quick feedback (SFINTE01)
    Gives me a variety of alternatives for solving my problem (SFINTE02)
    Has a natural and predictable screen changes (SFINTE03)
  Access
    Responds quickly during the busy hours of the day (SFACCE01)
    Makes it easy to contact the customer support manager (SFACCE02)
    Makes it easy to get to customer support information (SFACCE03)
Service quality
  Tangibles
    Has a modern looking interface (SQTANG01)
    Has visually appealing features (SQTANG02)
    Has visually appealing materials (SQTANG04)
  Reliability
    Provides the right solution to my request (SQRELI01)
    Presents a useful alternative to solve my problem (SQRELI02)
    Dependable (SQRELI03)
  Responsiveness
    Tells me exactly when services will be performed (SQRESP01)
    Gives me prompt service (SQRESP02)
  Assurance
    I trust the Web-based support interface (SQASSU01)
    I feel safe when making transactions (SQASSU02)
  Empathy
    Does not give me individual attention (SQEMP01R)
    Not a good interface to communicate my needs (SQEMP02R)
    Does not have convenient operating hours (SQEMP03R)
Satisfaction
  User satisfaction
    My overall satisfaction level with regard to the Web-based support interface I use is better than what I expected (SATISF01)
    The overall quality of the Web-based support interface I use was better than I thought it would be (SATISF03)

But did the research study participants consider the customer support system separately from other systems provided by firms? Here, respondents were asked to provide information about actual Web-based customer support systems that they had used. No respondents indicated that the task was difficult or confusing. In fact, during the pilot studies, respondents who worked professionally in customer support indicated explicitly that the tasks involved were clear to them. Based on these observations, and the procedures employed in the study (such as the detailed instructions given to the instructors of the courses from which participants were drawn), it is fair to conclude that the responses made by participants reflected their opinions about their actual experiences.

Table 3
Reliability results

Measures          Number of items   Reliability a
Informativeness   3                 0.810
Entertainment     3                 0.904
Interactivity     3                 0.693
Access            3                 0.778
Tangibles         3                 0.786
Reliability       3                 0.750
Responsiveness    2                 0.754
Assurance         2                 0.809
Empathy           3                 0.684
Satisfaction      2                 0.837

a Cronbach's alpha.


Fig. 2. Quality dimension model for Web-based customer support system.

Limiting participation to only those potential respondents who had actually interacted with firms over the Web added to confidence about the targets of respondents' evaluations. Examination of all Web sites mentioned by respondents in their extended comments allowed verification that they were considering sites for customer support as they made their responses.

5.2. Observations regarding the model

A significant relationship between information quality and system effectiveness was found, indicating that the effectiveness of a Web-based customer support system increases as the quality of information is improved.


Web-based customer support system providers need to consider additional information quality features in comparison with traditional customer support systems, which often use printed reports to evaluate information quality. Depending on the degree of participation, users' perceptions of information quality may vary. In areas where users participate in updating information, they may rate the information quality higher. One way in which a Web-based customer support system can increase overall information quality is to reduce input uncertainty, which occurs due to unclear descriptions by users of the problems they encounter [65]. A Web-based customer support system that provides a self-service option can help articulate users' questions. Questions that are not answered through self-service may become more refined by the time the user seeks support from staff members.

The findings of this study also show that system quality is associated with user satisfaction. It is easy to believe that users are satisfied when system quality meets their expectations. On the other hand, the relationship must be context dependent. Users' views of system quality differ depending on what is being supported. For example, a short down time in a mission-critical system may negatively influence users' perceptions of system quality. On the other hand, a similar duration of down time in an information-only system may be considered normal, and may not influence users' views of system quality.

This study did not find a relationship between service quality and system effectiveness. One possible explanation is that the SERVQUAL instrument did not apply to the IS field [38]. There may be more factors to consider than the five included in SERVQUAL. Other concepts, such as upward communication, levels of management, task standardization, teamwork, employee–job fit, technology–job fit, and role ambiguity, may all be important to consider in understanding the relationship between service quality and effectiveness from the IS perspective. Service quality introduces different challenges than information quality and system quality do. In addition, the unit of analysis for service quality can change with the user's circumstances. For example, when the system is down, the user may become more concerned with service from the IS department, changing the unit of analysis from the designed system to the IS department.

In conclusion, the results obtained in this study have some important implications for managing Web-based customer support systems. Participants in the customer support system, including managers, designers, and support staff [20], will benefit from the inclusion of quality features in the design [29]. This will help address management's concerns about cost control and success strategy [42].

References
[1] The Association of Support Professionals (ASP), This year's 10 best Web support sites, http://www.asponline.com, 23 March 2000.
[2] J.E. Bailey, S.W. Pearson, Development of a tool for measuring and analyzing computer user satisfaction, Management Science 29 (5), 1983, pp. 530–545.
[3] H. Barki, J. Hartwick, Rethinking the concept of user involvement, MIS Quarterly 13 (1), 1989, pp. 53–63.
[4] J.J. Baroudi, M.H. Olson, B. Ives, An empirical study of the impact of user involvement on system usage and information satisfaction, Communications of the ACM 29 (3), 1986, pp. 232–238.
[5] J.J. Baroudi, W.J. Orlikowski, A short form measure of user information satisfaction: a psychometric evaluation and notes on use, Journal of Management Information Systems 4 (4), 1988, pp. 44–59.
[6] R.C. Beatty, J.P. Shim, M.C. Jones, Factors influencing corporate Web site adoption: a time-based assessment, Information and Management 38 (6), 2001, pp. 337–354.
[7] M.J. Bitner, Evaluating service encounters: the effects of physical surroundings and employee responses, Journal of Marketing 54 (2), 1990, pp. 69–82.
[8] D.E. Bowen, C. Siehl, B. Schneider, A framework for analyzing customer service orientations in manufacturing, Academy of Management Review 14 (1), 1989, pp. 75–95.
[9] R.B. Chase, Where does the customer fit in a service operation? Harvard Business Review 56 (6), 1978, pp. 137–142.
[10] M.A. Company, Unlocking the computer's profit potential, McKinsey Quarterly (1968) 7–31.
[11] J. D'Ambra, R.E. Rice, Emerging factors in user evaluation of the World Wide Web, Information and Management 38 (6), 2001, pp. 373–384.
[12] P.J. Danaher, V. Haddrell, A comparison of question scales used for measuring customer satisfaction, International Journal of Service Industry Management 7 (4), 1996, pp. 4–26.
[13] T.H. Davenport, P. Klahr, Managing customer support knowledge, California Management Review 40 (3), 1998, pp. 195–208.
[14] F.D. Davis, Perceived usefulness, perceived ease of use, and user acceptance of information technology, MIS Quarterly 13 (3), 1989, pp. 319–340.
[15] W.H. DeLone, E.R. McLean, Information systems success: the quest for the dependent variable, Information Systems Research 3 (1), 1992, pp. 60–95.

[16] G. DeSanctis, R.B. Gallupe, A foundation for the study of group decision support systems, Management Science 33 (5), 1987, pp. 589–609.
[17] W.J. Doll, G. Torkzadeh, The measurement of end-user computing satisfaction, MIS Quarterly 12 (2), 1988, pp. 259–274.
[18] R.H. Ducoffe, Advertising value and advertising on the Web, Journal of Advertising Research 36 (5), 1996, pp. 21–35.
[19] P. Ein-Dor, E. Segev, Organizational context and the success of management information systems, Management Science 24 (10), 1978, pp. 1064–1077.
[20] O.A. El Sawy, G. Bowles, Redesigning the customer support process for the electronic economy: insights from Storage Dimensions, MIS Quarterly 21 (4), 1997, pp. 457–483.
[21] B. Evans, Ratings rise–numbering success, Information Week 12, 1996, pp. 6.
[22] A.V. Feigenbaum, Total Quality Control, McGraw-Hill, New York, 1983.
[23] L.Y. Fok, W.M. Fok, S.J. Hartman, Exploring the relationship between total quality management and information systems development, Information and Management 38 (6), 2001, pp. 355–371.
[24] C.R. Franz, D. Robey, Organization context, user involvement, and usefulness of information systems, Decision Sciences 17 (3), 1986, pp. 329–356.
[25] D.F. Galletta, A.L. Lederer, Some cautions on the measurement of user information satisfaction, Decision Sciences 20 (3), 1989, pp. 419–438.
[26] W. Gates, Business at the Speed of Thought, Warner Books, New York, 1999.
[27] M.J. Ginzberg, Early diagnosis of MIS implementation failure: promising results and unanswered questions, Management Science 27 (4), 1981, pp. 459–478.
[28] M.J. Ginzberg, Key recurrent issues in the MIS implementation process, MIS Quarterly 5 (2), 1981, pp. 47–60.
[29] K. Goffin, Evaluating customer support during new product development: an exploratory study, Journal of Product Innovation Management 15 (1), 1998, pp. 42–56.
[30] D.L. Goodhue, Understanding user evaluations of information systems, Management Science 41 (12), 1995, pp. 1827–1844.
[31] T. Guimaraes, Y. Gupta, Measuring top management satisfaction with the IS department, Omega 16 (1), 1988, pp. 17–24.
[32] S. Hamilton, N.L. Chervany, Evaluating information system effectiveness. I. Comparing evaluation approaches, MIS Quarterly 5 (3), 1981, pp. 55–69.
[33] D.L. Hoffman, T.P. Novak, A new marketing paradigm for electronic commerce, Information Society 13 (1), 1997, pp. 43–54.
[34] B. Ives, M.H. Olson, J.J. Baroudi, The measurement of user information satisfaction, Communications of the ACM 26 (10), 1983, pp. 785–793.
[35] E. Kay, http://www.wheres myorder?com, Computerworld 33 (29), 1999, pp. 82–83.
[36] F.N. Kerlinger, H.B. Lee, Foundations of Behavioral Research, 4th ed., Harcourt College Publishers, Ft. Worth, TX, 2001.


[37] W.J. Kettinger, C.C. Lee, Perceived service quality and user satisfaction with the information services function, Decision Sciences 25 (5/6), 1994, pp. 737–766.
[38] W.J. Kettinger, C.C. Lee, Pragmatic perspectives on the measurement of information systems service quality, MIS Quarterly 21 (2), 1997, pp. 223–240.
[39] W.R. King, B.J. Epstein, Assessing information system value: an experimental study, Decision Sciences 14 (1), 1983, pp. 34–45.
[40] R.E. Kraut, S.T. Dumais, S. Koch, Computerization, productivity, and quality of work-life, Communications of the ACM 32 (2), 1989, pp. 220–238.
[41] C. Kriebel, A. Raviv, An economics approach to modeling the productivity of computer systems, Management Science 26 (3), 1980, pp. 297–311.
[42] M.M. Lele, How service needs influence product strategy, Sloan Management Review 28 (1), 1986, pp. 63–70.
[43] V. Lenz, The Saturn Difference: Creating Customer Loyalty in Your Company, Wiley, New York, 1999.
[44] H.K.N. Leung, Quality metrics for Internet applications, Information and Management 38 (3), 2001, pp. 137–152.
[45] C. Liu, K.P. Arnett, Exploring the factors associated with Web site success in the context of electronic commerce, Information and Management 38 (1), 2000, pp. 23–33.
[46] J. Longoria, Focus on: help desk applications and technologies, Telemarketing and Call Center Solutions 14 (8), 1996, pp. 52–55.
[47] H.C.J. Lucas Jr., Empirical evidence for a descriptive model of implementation, MIS Quarterly 2 (2), 1978, pp. 27–41.
[48] H.C.J. Lucas Jr., An experimental investigation of the use of computer-based graphics in decision-making, Management Science 27 (7), 1981, pp. 757–768.
[49] M.A. Mahmood, J.N. Medewitz, Impact of design methods on decision support system success: an empirical assessment, Information and Management 9 (3), 1985, pp. 137–151.
[50] M.A. Mahmood, J.A. Sniezek, Defining decision support systems: an empirical assessment of end-user satisfaction, Information and Management 27 (3), 1989, pp. 253–271.
[51] P.K. Mills, R.B. Chase, N. Margulies, Motivating the client/employee system as a service production strategy, Academy of Management Review 8 (2), 1983, pp. 301–310.
[52] P.K. Mills, N. Margulies, Toward a core typology of service organizations, Academy of Management Review 5 (2), 1980, pp. 255–265.
[53] P.K. Mills, D.J. Moberg, Perspectives on the technology of service operations, Academy of Management Review 7 (3), 1982, pp. 467–478.
[54] M.H. Morris, D.L. Davis, Measuring and managing customer service in industrial firms, Industrial Marketing Management 21 (4), 1992, pp. 343–353.
[55] P.J.A. Nagel, W.W. Cilliers, Customer satisfaction: a comprehensive approach, International Journal of Physical Distribution and Logistics Management 20 (6), 1990, pp. 2–46.
[56] J.C. Nunnally, Psychometric Theory, McGraw-Hill, New York, 1978.
[57] R.L. Oliver, A conceptual model of service quality and service satisfaction: compatible goals, different concepts, in: T.A. Swartz, D.E. Bowen, S.W. Brown (Eds.), Advances in Services Marketing and Management, vol. 2, JAI Press, Greenwich, CT, 1993, pp. 65–85.


[58] A. Parasuraman, L.L. Berry, V.A. Zeithaml, Refinement and reassessment of the SERVQUAL scale, Journal of Retailing 67 (4), 1991, pp. 420–450.
[59] A. Parasuraman, V.A. Zeithaml, L.L. Berry, A conceptual model of service quality and its implications for future research, Journal of Marketing 49 (4), 1985, pp. 41–50.
[60] A. Parasuraman, V.A. Zeithaml, L.L. Berry, SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality, Journal of Retailing 64 (1), 1988, pp. 12–40.
[61] R.A. Peterson, W.R. Wilson, Measuring customer satisfaction: fact and artifact, Journal of the Academy of Marketing Science 20, 1992, pp. 61–71.
[62] L.F. Pitt, R.T. Watson, C.B. Kavan, Service quality: a measure of information systems effectiveness, MIS Quarterly 19 (2), 1995, pp. 173–187.
[63] L.F. Pitt, R.T. Watson, C.B. Kavan, Measuring information systems service quality: concerns for a complete canvas, MIS Quarterly 21 (2), 1997, pp. 209–221.
[64] R.F. Powers, G.W. Dickson, MIS project management: myths, opinions and reality, California Management Review 15 (3), 1973, pp. 147–156.
[65] S. Rathnam, V. Mahajan, A.B. Whinston, Facilitating coordination in customer support teams: a framework and its implications for the design of information technology, Management Science 41 (12), 1995, pp. 1900–1921.
[66] L. Raymond, Organizational characteristics and MIS success in the context of small business, MIS Quarterly 9 (1), 1985, pp. 37–53.
[67] B.H. Reich, I. Benbasat, An empirical investigation of factors influencing the success of customer-oriented strategic systems, Information Systems Research 1 (3), 1990, pp. 325–347.
[68] R. Savoia, Custom tailoring, CIO 9 (17), 1996, pp. 112.
[69] P.B. Seddon, S. Staples, R. Patnayakuni, M. Bowtell, Dimensions of information systems success, Communications of the AIS 20 (2), 1999.
[70] A. Srinivasan, Alternative measures of systems effectiveness: associations and implications, MIS Quarterly 9 (3), 1985, pp. 243–253.
[71] J. Steuer, Defining virtual reality: dimensions determining telepresence, Journal of Communication 42 (4), 1992, pp. 73–93.
[72] E.B. Swanson, Management information systems: appreciation and involvement, Management Science 21 (2), 1974, pp. 178–188.
[73] M. Tan, T.S.H. Teo, Factors influencing the adoption of Internet banking, Journal of the AIS 5 (1), 2000.
[74] T.S.H. Teo, V.K.G. Lim, R.Y.C. Lai, Intrinsic and extrinsic motivation in Internet usage, Omega 27 (1), 1999, pp. 25–37.
[75] F. Tourniaire, R. Farrell, The Art of Software Support, Prentice-Hall, Upper Saddle River, NJ, 1997.
[76] W.M.K. Trochim, The Research Methods Knowledge Base, 2nd ed., Atomic Dog Publishing, Cincinnati, 2001.
[77] T.P. Van Dyke, L.A. Kappelman, V.R. Prybutok, Measuring information systems service quality: concerns on the use of the SERVQUAL questionnaire, MIS Quarterly 21 (2), 1997, pp. 195–208.
[78] H.A. Wan, Opportunities to enhance a commercial Web site, Information and Management 38 (1), 2000, pp. 15–22.
[79] R. Weston, Web support we trust? Information Week 741, 1999, pp. 117–118.
[80] B.H. Wixom, An empirical investigation of the factors affecting data warehousing success, MIS Quarterly 25 (1), 2001, pp. 17–41.
[81] Y. Yoon, T. Guimaraes, Q. O'Neal, Exploring the factors associated with expert systems success, MIS Quarterly 19 (1), 1995, pp. 83–106.
[82] V.A. Zeithaml, L.L. Berry, A. Parasuraman, The nature and determinants of customer expectations of service, Journal of the Academy of Marketing Science 21 (1), 1993, pp. 1–12.

Solomon Negash is a Visiting Assistant Professor of MIS at the Argyros School of Business and Economics, Chapman University. He received his PhD from Claremont Graduate University. His research interests include Web-based customer support, knowledge transfer, and business intelligence.

Terry Ryan is an Associate Professor of Information Science at Claremont Graduate University. He received his PhD from Indiana University. Ryan's publications have appeared in DATA BASE (ACM), Interface, Journal of Computer Information Systems, Journal of Database Management, and Journal of Systems Management. His research interests include Web-based customer support, information systems in the state government sector, discontinuous changes in Web sites, and the development and evaluation of information systems.

Magid Igbaria died 3 August 2002, after a lengthy illness. Magid earned a PhD from Tel Aviv University. He held the rank of Professor at both Claremont Graduate University and Tel Aviv University. He was ranked as the most productive researcher in the IS field in a number of studies. He published over 100 articles on topics such as e-commerce, the virtual workplace, computer technology acceptance, IS personnel, and the management of IS. Magid was loved by his family, friends, students, and colleagues. His life was one of compassion, intelligence, energy, humility, and dedication. He will be missed.
