
SOFTWARE QUALITY ASSURANCE & MANAGEMENT

A CASE STUDY ON

WEB METRICS FRAMEWORK FOR IMPROVING QUALITY ASSURANCE


(Phase 3)

Presented To:
Dr. Qaiser S. Durrani

Presented By:
Muhammad Irfan Khan (07L-0858)
Saqib Aziz (07L-0861)
Ahmad Mohsin (07L-0852)
Case Study
Phase III
Table of Contents

Objectives of Study (Revised)
Problem Statement (Revised)
Problem Elaboration (Revised)
Scope
Prior Literature (Revised)
Research Methodology (Revised)
    Develop Web Metrics Framework
    Selection of CRMs
    Data Collection and Analysis
        Size
        No. of Projects under Study
    Interpretation of Data
    Research Report for Findings
Development of Hypotheses (Revised)
    Hypothesis I: H0 Perf / H1 Perf
    Hypothesis II: H0 Sec / H1 Sec
    Hypothesis III: H0 Usab / H1 Usab
Mathematical Description of Hypotheses (Revised)
Web Metrics Framework
    Formula for Calculating Page Performance
Research Results and Data Analysis
Discussion of Findings with Statistical Analysis
Conclusion
Future Work
References (Revised)


Objectives of Study (Revised)


The objective of our study is to use web metrics to improve the overall quality of Web applications. We will study past and current quality-related trends in web applications and show how web metrics can be used to improve the quality of CRM applications.

Within a short period, the Internet and World Wide Web have become ubiquitous,
surpassing all other technological developments in our history. They have also
grown rapidly in their scope and extent of use, significantly affecting all aspects of
our lives. Industries such as manufacturing, travel and hospitality, banking,
education, and government are Web-enabled to improve and enhance their
operations. E-commerce has expanded quickly, cutting across national boundaries.
Even traditional legacy information and database systems have migrated to the
Web[1]. Advances in wireless technologies and Web-enabled appliances are
triggering a new wave of mobile Web applications. As a result, we increasingly
depend on a range of Web applications. Now that many of us rely on Web based
systems and applications, they need to be reliable and perform well. To build these
systems and applications, Web developers need a sound methodology, a
disciplined and repeatable process, better development tools, and a set of good
guidelines. The emerging field of Web engineering fulfils these needs. It uses
scientific, engineering, and management principles and systematic approaches to
successfully develop, deploy, and maintain high-quality Web systems and
applications. It aims to bring the current chaos in Web based system development
under control, minimize risks, and enhance Web site maintainability and quality.

A further objective of our study is to gain a thorough understanding of web applications, their trends, and the technologies in use. When we talk about Software Engineering for the Web, the term Web Engineering comes to mind, and Web Engineering is evolving every day. Our focus is to analyze current web applications critically, define a Quality Assurance framework for Web applications, define web metrics, and relate those metrics to the Quality Assurance of Web applications.

The main objectives of this study are:

• To define a framework for quality assurance.

• To identify Web Metrics that will boost the quality of a product.

• To understand the failure causes for web applications.

• To develop a model that ensures adherence to software product standards, processes, and procedures, and to assure that standards and procedures are established and followed throughout the software acquisition life cycle.

• To capture standards that will directly affect a business function.

• To investigate the issues, and present a distinguishable model.

• To measure the subject adherence of web pages.

• To measure the content quality of web pages.

• To measure the popularity of web pages.

• To measure the linkage between web pages.

• To measure the similarity between web pages.


Problem Statement (Revised)

Businesses these days employ the Web as an integral part of their strategies to make their products available worldwide. This has led to an increase in the number of web applications deployed for business expansion. With this increase, quality has become the most overlooked aspect of web products. Because of their global reach, a small compromise in quality has serious implications for the reputation of the business. This demands the formulation of a standard framework to ensure the quality of online applications.

We intend to study and analyze Quality Assurance issues with regard to the Web, as quality is often not given much importance across the SDLC. The industry is maturing, however, and understanding of Quality Assurance throughout the life cycle of a product under construction has improved considerably. When we talk about Web quality, web metrics come into play; web metrics and quality assurance are closely related. In our problem statement we will define a framework for the quality of web applications. Metrics, as we know, refer to standards of measurement; therefore, web metrics are standardized ways of measuring something that relates to the Web.

In our case study we are focusing on web applications and, to be more precise, we will be using CRM applications and mapping these metrics to evaluate quality.

We decided to obtain our main data from a renowned software house, Mindshare Solutions, which specializes in CRM applications. The problem is that they have introduced three demo versions of their CRM application, named Unify CRM, but the reported results indicate low quality as far as the overall performance of the product is concerned. We decided to carry out a comprehensive comparison of Unify CRM with other renowned open-source CRMs and to evaluate it on the basis of our Web Metrics Framework, focusing on performance, usability, and security, and examining the quality of the products in association with these quality metrics for web applications.

When considering the Web, it becomes clear that there is an abundance of different things we can measure. Consider web traffic: while it is certainly possible to review the entire Web and how "busy" it is [2], the practical usefulness of such a global view is limited. Or consider web page similarity, as computed by Google and other search engines: unless it is performed over a sufficiently wide set of pages (rather than only a small number, such as those of a particular site), its practical usefulness is likely to be limited. In other words, different metrics can be applied to different views of the Web.

Problem Elaboration (Revised)


In our case study we are trying to frame web metrics for the Quality Assurance of web applications, and for this purpose we have devised a Web Metrics Framework for Quality Assurance. To base our study on empirical data, we visited one of the leading software houses in the industry, Mindshare, which specializes in CRM applications and produces three different levels of CRMs. The problem is that they produced three demos of their CRM, and all three failed badly. We will therefore conduct an empirical analysis of how to improve the quality of web applications, based on the web metrics framework we devised; the organization wanted to know how to improve the quality of its web products (CRMs).
To make our study more useful, we will choose four projects and compare them on the basis of our web metrics, examining how closely quality is related to these metrics. We will then compare our results to see to what extent the metrics improve the quality of the products.
Quality Assurance has been a debatable topic in the Web development industry. Some hold that quality is only a value-added service, not an integral part of the software delivered; another view is that quality can be assured only if cost and time permit. Because schedules are usually tight, savings are often made at the expense of the Quality Assurance phase. However, most sophisticated IT companies realize the importance of Quality Assurance and have therefore established entire departments for this very purpose. Quality Assurance has different dimensions, which vary according to the nature of the project. For example, a Web-based application must incorporate proper performance, security, and usability measures and should perform efficiently. Similarly, critical systems need to be quality-assured against breakdown and should have efficient backup flows.
We will define a framework for the quality of web applications. Metrics refer to standards of measurement; therefore, web metrics are standardized ways of measuring something that relates to the quality of web applications.

Web metrics play a very important role in determining the actual characteristics of a web application's functionality. In our study we have identified key web metrics that actually affect the quality of web applications.
To support our study in a more specific way, we selected a few of these web metrics and checked them against real web applications to see their impact on Quality Assurance. To be more realistic, we chose several CRM applications on which to check the impact of these metrics.
The following metrics will be used to determine the extent to which the CRM applications adhere to quality:
- Performance

- Security

- Usability

Although we identified a few other metrics in the initial phase of our study, we considered the metrics listed above to be the most relevant to our case study.
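One way to operationalize these three metrics is as a weighted score per CRM. The sketch below is purely illustrative: the weights, the 0-100 scales, and the sample scores are our own assumptions for the example, not values prescribed by the framework.

```python
# Illustrative sketch: combining the three chosen metric areas into a
# single weighted quality score per CRM. Weights and sample scores are
# assumptions for the example only.

METRIC_WEIGHTS = {"performance": 0.4, "security": 0.3, "usability": 0.3}

def quality_score(scores):
    """Combine per-metric scores (each on a 0-100 scale) into one
    weighted quality score for a CRM application."""
    return sum(METRIC_WEIGHTS[m] * scores[m] for m in METRIC_WEIGHTS)

# Hypothetical scores for one CRM:
unify = {"performance": 62, "security": 70, "usability": 55}
print(round(quality_score(unify), 1))
```

A framework in practice would derive each of the three scores from the detailed sub-metrics described later, rather than assigning them directly.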

Scope
This study encompasses key web QA metrics that can be used to ensure the quality of
Web Applications and specifically to CRM Applications. Following are the important key
areas:

1- Web Traffic

Web traffic can be measured as the number of hits a site receives during a given interval, but also as the amount of data transferred during that interval. It is useful for determining popularity and, through trend analysis, for estimating future needs.

2- Web Page Significance


Significance can be seen as a formalization of quality and relevance of web
pages with regard to their information content. (Here, quality does indeed refer to
information content, and not to design quality, as discussed above).

3- Accessibility
Accessible design principles often result in a significant overall increase in the usability of a Web site: faster completion of tasks, lower error rates, and more effective retention of knowledge of the site by repeat users.

4- Relevance
Relevance is a direct measure of how well a particular page satisfies the
information need of some user, typically expressed as a set of query words.
As the Web grows exponentially, it seems logical to assume that the number
of documents that contain the same query words is typically also increasing.
The need for metrics that can order all such documents so that those that are
most relevant can be examined first is thus greater than ever.

5- Web Page Similarity


Web search engines such as Google have for some time, in addition to relevance, allowed users to retrieve similar pages. Similarity is typically measured in three ways: content-based, link-based, and usage-based.


6- Search Engine Optimization


Website visibility and ranking in search engines can be very important in the web-commerce industry. A website optimized for search engines can yield more profit.

7- Performance
A fast website greatly improves the user experience, and this brings returning visitors.

8- Security
Online security is perhaps the aspect most often overlooked in local software houses; websites with poor security implementations will invariably harm both users and the business.

9- Ease of Use
Quality issues regarding the ease of use of a web application are important in
that they help a business to retain their clientele. Also, such applications are
easier to maintain and change.

10- Portability
With a growing range of computer hardware and software platforms, it is
important for ecommerce applications to be able to perform consistently and
provide similar functionality in different computing environments.

11- Reliability
As with traditional software, reliability is always an important quality issue for
users. A system application should always produce consistent results and
outputs for a given fixed input; otherwise the application cannot be trusted to deliver high-quality service.
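Of the aspects listed above, performance (item 7) is the most directly measurable. The minimal sketch below times a fetch operation and averages several samples; the `fetch` callable is a stand-in of our own, and in practice it would wrap an HTTP GET of one of the CRM pages under study.

```python
# Minimal sketch of a page-performance probe: time an arbitrary fetch
# operation and average several samples. `fetch` is any zero-argument
# callable (in the study, a wrapper around an HTTP GET of a CRM page);
# using a callable keeps the timing logic testable without network
# access.
import time

def timed(fetch):
    """Return elapsed seconds for one call to `fetch`."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start

def average_time(fetch, samples=3):
    """Average elapsed time over `samples` calls to `fetch`."""
    return sum(timed(fetch) for _ in range(samples)) / samples
```

Averaging over several samples smooths out transient network and server load effects, which matters when comparing response times across different CRMs.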

Prior Literature (Revised)
Quality Assurance is an important step in the website development process and, by
all means, should not be skipped. A broken link or a misspelled word may seem like
trivial mistakes, but they can greatly undermine the credibility of your website. You
want people who visit your site to feel assured of the quality of the information they
find.
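As a small illustration of this kind of QA check, the sketch below collects the outgoing links of a page so that each can be verified. Only the collection step is shown; the verification step (one HTTP request per link, flagging non-success responses) is indicated in a comment, and the sample HTML is invented for the example.

```python
# Sketch of the first half of a broken-link check: collect every href
# on a page. Each collected URL would then be requested (e.g. with an
# HTTP HEAD) and any error response reported as a broken link; that
# network step is omitted here. The sample HTML is invented.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href attribute of every anchor tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="/home">Home</a> and <a href="http://example.com/x">X</a></p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)
```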
As an emerging discipline, Web engineering actively promotes systematic, disciplined, and quantifiable approaches towards the successful development of high-quality, ubiquitously usable Web-based systems and applications. [3]
A simple definition of quality in the context of Web sites is that 'quality is meeting requirements'. This definition works because, by creating technical specifications and requirements that describe the various attributes of a Web site and how it should function, you set yourself goals to achieve and determine specific indicators of quality. Quality can then be measured by testing, at intervals, various aspects of your Web site and the complex relationships between all areas of the site.
The current World Wide Web has many flaws, with a great many resources failing to comply with published standards. As we move towards a richer, more structured Web, it will be essential that quality assurance is built into development processes: unlike HTML, XML applications formally require strict adherence to the standards and may fail to render if this is not the case. However, even when a resource does comply with standards, it does not mean that the user experience will necessarily be a happy one.
Thus, a combination of supplier QA and user satisfaction assessment are needed.
However, linking the subjective perceptions of users with the QA practices of
suppliers is not a simple task. The next stage of work is to model the relationships
between user satisfaction and supplier initiatives (such as QA procedures). One way
in which this might be done is through quality function deployment (QFD): “a
structured and disciplined process that provides a means to identify and carry the
voice of the customer through each stage of product and or service development
and implementation” [4].

Software process and product metrics are quantitative measures that enable
software people to gain insight into the efficacy of the software process and the
projects that are conducted using the process as a framework. Basic quality and
productivity data are collected. These data are then analyzed, compared against
past averages, and assessed to determine whether quality and productivity
improvements have occurred. [5]
The Internet and the World Wide Web (WWW, or simply the Web) are specific examples of general heterogeneous systems, and QA for these systems is gaining importance. [6]
Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis, and testing, by studying and proposing methodologies and tools. [7]
Given the organic growth of the Web, we require new metrics that provide deeper insight into the Web as a whole and into individual sites from different perspectives. Arguably, the most important motivation for deriving such metrics is the role they can play in improving the quality of information available on the Web. [8]
When you can measure what you are speaking about, and express it in numbers,
you know something about it; but when you cannot express it in numbers, your
knowledge is of a meager and unsatisfactory kind; it may be the beginning of
knowledge, but you have scarcely in your thoughts advanced to the state of
science.[9]
Metrics help organizations generate more effective Web sites and provide measures that practitioners understand and that academics can replicate and analyze. To provide practical value, metrics should identify frequency of measurement, frequency of review, source of data, rationale for introducing the measure, who will act on the data, and the purpose of the measure (Neely 1998). For scientific, quantitative rigor, metrics should exhibit, at a minimum, construct validity and reliability (Straub 1989; Cook and Campbell 1979). [10]

For effective measurement, the measurement activity should have clear objectives and identify the types of attributes that can be measured and appropriate scales. Empirical relations for an attribute should be identified in advance, and the measurement scale must be meaningful. [10]

Research Methodology (Revised)


We worked with a software house, Mindshare Solutions, to find quality issues in their CRM system, UnifyCRM, which they sell as a product to different international clients. We used our web metrics framework to assess its level of quality, and ran the same framework on the CRM systems of the company's competitors. The results generated by our framework were used to find the differences in quality levels between Unify CRM and the competitors' CRMs.

We were asked to evaluate the quality of CRMs by focusing on three parameters:


1) Performance
2) Security
3) Usability
These metrics will be elaborated in detail in the coming sections.
To develop a detailed metrics framework for the above-mentioned parameters of quality, we explored existing web engineering practices, techniques, and core web metrics used in the web development industry. Our research methodology tried to identify the successful and problematic areas in the existing approaches used during the development life cycle of web-based systems. We also identified different aspects used in the local industry to evaluate the quality of a website. In the case study we define how to measure these aspects and how these metrics can be used to improve the quality of web applications, especially CRMs.

Now we describe the steps used to formulate the structured research methodology for the case study, in which we used some real-world CRM projects. Our main aim is to identify a set of key quality aspects and then formulate a workable framework for the quality metrics thus identified. Our research methodology is a systematic process for achieving the objectives of this study; the methodology formulated for the case study is briefly explained below.

Develop Web Metrics Framework

Detailed metrics for performance, security, and usability were developed to help us measure the CRMs from different quality aspects. This detailed framework is explained in a later section.

Selection of CRMs
We selected certain Customer Relationship Management (CRM) systems with a focus on QA metrics. Our selection consists of a mix of open-source and proprietary CRMs. We selected four systems:

1. Unify CRM, a product of Mindshare Solutions Pvt. Ltd.
   (http://unifycrm.com/UnifyCRM/Login.aspx)

2. Salesforce CRM
   (http://www.salesforce.com/)

3. Enterprise CRM and Groupware System
   (http://sourceforge.net/projects/egs/)

4. SugarCRM, an open-source CRM
   (http://www.sugarcrm.com/crm/)

Salesforce CRM

The proven leader in on-demand customer relationship management (CRM),
salesforce.com empowers customers to stand out from the crowd. We do so by
delivering the most innovative technology and making it as easy as possible to share
and manage business information. Our solutions combine award-winning
functionality, proven integration, point-and-click customization, global capabilities,
and the best user experience; the result is CRM success. That's why Salesforce has earned the trust of its customers and a customer success rate of 95%. Salesforce SFA enables companies to drive sales productivity, increase visibility, and expand revenues with an affordable, easy-to-deploy service that delivers success to companies of all sizes.

Following are key features of Salesforce CRM:-

Service & Support

The Salesforce solution for customer service gets companies up and running in a
matter of weeks with a call center application that is loved by agents and a customer
self-service application— powered by Web 2.0—that generates new levels of
customer loyalty.

Partner Relationship Management

Salesforce Partners makes it easy for partners to access leads, collaborate on deals, and locate all the information they need in order to be successful. Salesforce Partners is seamlessly integrated with Salesforce SFA to deliver unparalleled visibility into your company's entire sales pipeline for direct and indirect channels.

Marketing

Salesforce Marketing enables closed-loop marketing to execute, manage, and
analyze the results of multichannel campaigns. Marketing executives can measure
the ROI of their budgets, tie revenue back to specific marketing programs, and make
adjustments in real time.

Content

Salesforce Content brings Web 2.0 usability to your business content so you can
share it more effectively and enhance collaboration within your organization.
Empower employees to find the exact documents they need, right from the business
applications they use on a daily basis.

Analytics

Salesforce Analytics empowers business users at every level to gain relevant insight
and analysis. With real-time reporting, calculations, and dashboards, businesses can
optimize performance, decision making, and resource allocation.

Custom Applications

Build enterprise-class applications on salesforce.com's powerful on-demand


platform. Deliver all your company's business applications in a single environment
with one data model, one sharing model, and one user interface.

Industry Applications

Meet all of your industry-specific needs with salesforce.com's award-winning CRM, a


broad variety of on-demand apps from the AppExchange, and the Force.com
platform. Our industry applications are built on the successes of hundreds of
companies in your industry. And because no two companies are exactly alike, all
industry apps are fully and easily customizable.

AppExchange Applications

The AppExchange is your one-stop marketplace for on-demand business


applications. The AppExchange makes it easy to find, sample, and select from
hundreds of apps for your business, all preintegrated with Salesforce.

SOURCEFORGE CRM KEY FEATURES


The SourceForge CRM application is a highly flexible Sales Force Automation (SFA) tool that meets the needs of both sales managers and sales reps.
In addition to standard SFA functionality such as lead, account, and opportunity management, Sales Management provides a powerful sales management system to improve a sales organization's productivity, allowing management to plan ahead of economic changes in order to effectively manage any market condition.

It provides sales reps the capability to develop accurate forecasts, seamlessly share
information across sales teams, and configure products and services to meet the
unique needs of each customer. Its simple user interface is designed to improve
sales rep productivity, yet support best practices across the entire sales
organization. Sales Management's key features and capabilities include:
Improve forecast accuracy - The sales pipeline is continually updated in real time so
that everyone in your organization is provided with a clear view, allowing resources
to be focused accordingly.

Close more deals - Sales process visibility enables each member of your sales team
to know precisely what the other teammates are working on, allowing them to
collaborate to transform prospects into profitable customers.
Shorten and standardize unique sales cycles - Because different channels require different sales processes, Sales Manager allows an unlimited number of unique sales methodologies or sales processes to be created. Teams both inside and outside your organization can effectively work together to close accounts by scheduling events, assigning tasks, coordinating meetings, flagging new opportunities, and updating client files on every account.
Enable collaborative and consistent customer management - Real-time, secure
access to detailed account data enables you and your channel partners to
collaborate with sales, customer service & support, and marketing personnel. With
instant access to all communication, including email, notes, calls, resolutions, and
more, you can collectively manage customer relationships across your entire
extended enterprise.
Recognize "big picture" market trends - With SourceForge's flexible reporting system, it's easy to review and analyze sales data, both current and historical, allowing sales management to spot changes in customer behavior or shifts in key market indicators. Armed with a comprehensive contextual view of both past and current events, your sales organization can respond to evolving customer needs and economic conditions.

Project Admins: jstride, nsuk
Developers: 6
Database Environment: ADOdb, PostgreSQL (pgsql)
Development Status: 5 - Production/Stable
License: GNU General Public License (GPL)
Operating System: OS Independent (written in an interpreted language)
Programming Language: PHP
Translations: English
User Interface: Web-based
Project UNIX name: egs
Registered: 2003-05-22 06:18

Sugar 5.0 Key Features


SugarCRM is becoming a disruptive force in the small enterprise Customer
Relationship Management (CRM) market. Its commercial open source model, CRM
appliance option, low price, and strong set of CRM features are impacting more
traditional methods of CRM delivery and the perception of CRM value among small
enterprises. Despite its small size compared to other vendors, SugarCRM scored
highest in the Product Index of our evaluation, due to its broad feature set and
extremely flexible deployment options.
New Module Builder allows users to build custom modules from scratch or combine
existing or custom objects into a brand new CRM module.
New Metadata Driven User Interface (UI) stores customizations in a metadata
repository and combines the benefits of custom CRM with the ability to incorporate
new features in future releases.
Improved Access Control offers better support for team hierarchies and access
control functions that manage and protect information at the field level.
New AJAX Email Client delivers the functionality of a desktop email client with the
portability of a web-based email application.
Improved Dashboards with new charting capabilities, including support for funnel,
pie charts, line and bar graphs, and performance gauges.
Multiple Dashboards allows users to access any number of pre-built or custom
dashboards from their homepage.

Data Collection and Analysis

All CRMs will be passed through the metrics framework and data will be collected
against the different quality parameters. The data will be entered into statistical
software; we will use SPSS 15.0 for the statistical analysis of our data. We intend to
find out whether there is a significant difference in the quality of the selected CRMs
in terms of performance, security and usability, which can be accomplished by
applying a statistical test. As the total number of CRMs under study is four (4), we
will use the ANOVA test. [11]
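The one-way ANOVA computation that SPSS performs can be sketched in plain Python; the group scores below are hypothetical placeholders, not our collected data:

```python
from itertools import chain

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA."""
    values = list(chain.from_iterable(groups))
    grand_mean = sum(values) / len(values)
    means = [sum(g) / len(g) for g in groups]
    # Between-groups variation: how far each group mean sits from the grand mean
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    # Within-groups variation: spread of observations around their own group mean
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    df_between = len(groups) - 1
    df_within = len(values) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical quality scores for the four CRMs (placeholders, not our data)
f, dfb, dfw = one_way_anova([[66, 64, 68], [30, 29, 31], [35, 36, 34], [43, 42, 44]])
```

The resulting F statistic is then compared against the critical F value at the chosen level of significance to decide whether the group means differ.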

Organization Information
Size : 53
Expertise : Enterprise Solutions Provider

Project Information
No. of Projects under Study : 4
Domain of Projects : Customer Relationship Management System
Communication Person : Project Manager of the corresponding Organization's Quality Team
Table – 1

Interpretation of Data
After the statistical analysis of the data, we will describe the results, draw
inferences and recommend a course of action for the quality areas where Unify
CRM needs improvement.

Research Report for Findings


We will be making a comprehensive report to conclude our study and present a web
metric framework that can be employed as a tool to assess the quality of any CRM.

Pictorial Representation of Research Methodology (REVISED)


Fig – 1 (Representation of Research Methodology)

Development of Hypotheses (Revised)

Hypotheses:
For the purpose of analytical study, we have assumed the following hypotheses.

Hypothesis I
This hypothesis enables us to evaluate the impact of our framework’s performance
metrics on the quality of a web application.

Null Hypothesis H0 Perf


Performance metrics in our framework have no effect on the quality of a web
application.

Alternate Hypothesis H1 Perf
Performance metrics in our framework have a significant effect on the quality
of a web application.

Hypothesis II
Through this hypothesis we can evaluate the impact of our framework’s security
metrics on the quality of a web application.

Null Hypothesis H0 Sec


Security metrics in our framework have no effect on the quality of a web
application.

Alternate Hypothesis H1 Sec


Security metrics in our framework have a significant effect on the quality of a
web application.

Hypothesis III
This hypothesis helps us evaluate the impact of our framework’s usability metrics on
the quality of a web application.

Null Hypothesis H0 Usab


Usability metrics in our framework have no effect on the quality of a web
application.

Alternate Hypothesis H1 Usab


Usability metrics in our framework have a significant effect on the quality of a
web application.

Mathematical Description of Hypotheses (Revised)
For this study, we have selected the level of significance α = 0.05. This is the
probability of rejecting the null hypothesis when the null hypothesis is true, i.e.
there are 5 in 100 chances that the null hypothesis is wrongly rejected.

Hypotheses that were formulated with the hope that they would be rejected led to the
use of the term null hypothesis. Today this term is applied to any hypothesis we wish
to test and is denoted by H0. The rejection of H0 leads to the acceptance of an
alternative hypothesis, denoted by H1.

In these hypotheses, performance, security and usability are the independent
variables, and quality is the dependent variable, since quality depends on the
metrics in the framework.
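The decision rule implied by α = 0.05 can be expressed as a small sketch; the p-values shown are hypothetical, not our SPSS output:

```python
ALPHA = 0.05  # chosen level of significance

def decide(p_value, alpha=ALPHA):
    """Reject H0 when the observed significance (Sig.) falls below alpha."""
    return "reject H0, accept H1" if p_value < alpha else "fail to reject H0"

# Hypothetical Sig. values for the three hypotheses (not our SPSS output)
decisions = {name: decide(p) for name, p in
             [("performance", 0.003), ("security", 0.014), ("usability", 0.031)]}
```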

Web Metrics Framework

Security Metrics

Following are the security parameters used to evaluate the quality of a CRM
application, i.e. the extent to which security affects the overall quality of the product.

1. SQL injection
2. Cross-Site Scripting Attacks
3. Session Hijacking
4. Denial of Service
5. Buffer Overflows

SQL injection:
SQL injection is a type of security exploit in which the attacker adds Structured
Query Language (SQL) code to a Web form input box to gain access to resources or

make changes to data. An SQL query is a request for some action to be performed
on a database. [12]
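As an illustration, the sketch below shows a classic injection against string-concatenated SQL and the parameterized query that defuses it; the table and data are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "' OR '1'='1"  # classic injection payload typed into a form field

# Vulnerable: concatenation lets the payload rewrite the WHERE clause,
# so the query returns every row in the table
leaked = conn.execute(
    "SELECT * FROM users WHERE name = '" + malicious + "'").fetchall()

# Safe: a parameterized query treats the input purely as data
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)).fetchall()
```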

Cross-Site Scripting Attacks

Cross-site scripting (XSS) is a security exploit in which the attacker inserts malicious
coding into a link that appears to be from a trustworthy source. When someone
clicks on the link, the embedded programming is submitted as part of the client's
Web request and can execute on the user's computer, typically allowing the attacker
to steal information.
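A minimal sketch of the standard defence, escaping user-supplied input on output, using a hypothetical payload:

```python
from html import escape

# Hypothetical attacker-controlled input embedded in a page
user_input = '<script>steal(document.cookie)</script>'

unsafe_html = "<p>Hello " + user_input + "</p>"        # payload would execute
safe_html = "<p>Hello " + escape(user_input) + "</p>"  # payload rendered as text
```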

Denial of Service

A denial of service (DoS) attack is an incident in which a user or organization is


deprived of the services of a resource they would normally expect to have. In a
distributed denial-of-service, large numbers of compromised systems (sometimes
called a botnet) attack a single target.

Common forms of denial of service attacks are:

Attack          Detail
SYN Attack      When a session is initiated between the Transmission Control
                Protocol (TCP) client and server in a network, a very small
                buffer space exists to handle the usually rapid "hand-shaking"
                exchange of messages that sets up the session.
Teardrop        This type of denial of service attack exploits the way that the
Attack          Internet Protocol (IP) requires a packet that is too large for the
                next router to handle to be divided into fragments.
Smurf Attack    In this attack, the perpetrator sends an IP ping (or "echo my
                message back to me") request to a receiving site. The ping
                packet specifies that it be broadcast to a number of hosts within
                the receiving site's local network.
Viruses         Computer viruses, which replicate across a network in various
                ways, can be viewed as denial-of-service attacks where the
                victim is not usually specifically targeted but simply a host
                unlucky enough to get the virus.
Physical        Here, someone may simply snip a fiber optic cable. This kind of
Infrastructure  attack is usually mitigated by the fact that traffic can sometimes
Attacks         quickly be rerouted.
Table – 2 (Security Metric Measures)

Buffer Overflows

A buffer overflow occurs when a program or process tries to store more data in a
buffer (temporary data storage area) than it was intended to hold. Since buffers are
created to contain a finite amount of data, the extra information - which has to go
somewhere - can overflow into adjacent buffers, corrupting or overwriting the valid
data held in them. Although it may occur accidentally through programming error,
buffer overflow is an increasingly common type of security attack on data integrity.

Session Hijacking

Session hijacking, also known as TCP session hijacking, is a method of taking over
a Web user session by surreptitiously obtaining the session ID and masquerading as
the authorized user. Once the user's session ID has been accessed (through
session prediction), the attacker can masquerade as that user and do anything the
user is authorized to do on the network.
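One common mitigation against session prediction is to make session IDs unpredictable; a minimal sketch using a cryptographically secure random source:

```python
import secrets

def new_session_id() -> str:
    """128 bits from a CSPRNG: long and random enough to resist session prediction."""
    return secrets.token_hex(16)

sid = new_session_id()
```

Pairing unpredictable IDs with transport encryption (so the ID cannot simply be sniffed) addresses both halves of the hijacking threat.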

Usability Metrics

Here we discuss the aspects/parameters used to measure the usability of CRM
applications, and the extent to which they impact the quality of the product.
To determine if a system meets user/human-centered design goals of effectiveness,
efficiency, and user satisfaction, it is necessary to collect objective, quantifiable
data. The Human Factors Specialist develops experimental plans and data
collection procedures to obtain and statistically analyze measures of usability. Some
of the most frequently used metrics for this purpose are listed by goal as follows: [13]

• Effectiveness
o Training time
o Time to reach proficiency
o Number of commands/actions per task
o Number of commands/features that are never used
o Number of times "help" is accessed

• Efficiency
o Time to complete a task
o Error rate
o Number of tasks completed within a given time
o Error recovery time
o Decision time/delay

• User satisfaction
o Positive statements recorded during observations
o Negative statements recorded during observations
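As an illustration of how such observations become quantifiable measures, a sketch over a hypothetical observation log (not data from our study):

```python
# Hypothetical observation log for one test participant (not data from our study)
session = {
    "tasks_attempted": 10,
    "tasks_completed": 8,
    "errors": 3,
    "total_time_secs": 600,
}

completion_rate = session["tasks_completed"] / session["tasks_attempted"]  # effectiveness
error_rate = session["errors"] / session["tasks_attempted"]                # efficiency
secs_per_task = session["total_time_secs"] / session["tasks_completed"]    # efficiency
```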

Performance Metrics
Following are the parameters used to measure the performance of a CRM
application to evaluate the quality of the product. [14]

Parameters                          Detail
Simultaneous browser connections    Number of users making simultaneous connections to the server.
Warm up time (secs)                 The time required to open the first page for the first time is called the warm up time.
Total number of requests            Total number of requests sent during one iteration.
Total number of connections         Total number of active connections to the server during one iteration.
Average requests per second         Average number of requests per second during one iteration.
Average time to first byte (msecs)  Average time, in milliseconds, required to download the first byte after sending a request.
Average time to last byte (msecs)   Average time, in milliseconds, required to download the last byte after sending a request.
Number of bytes sent (bytes)        Number of bytes sent in one iteration.
HTTP Errors                         Numeric count of the HTTP errors.
Table – 3 (Performance Metric Measures)

Formula for Calculating Page Performance

The following formula quantifies the performance of a Web application by
measuring the number of processor cycles needed for each request. It divides the
number of cycles spent by the number of requests that were handled:

    (processor speed x processor use) / (number of requests per second)
        = cost, in cycles/request (or Hz/rps)
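A minimal sketch of this calculation, with hypothetical server figures:

```python
def cost_per_request(processor_hz: float, utilization: float, requests_per_sec: float) -> float:
    """(processor speed x processor use) / requests per second -> cycles per request."""
    return (processor_hz * utilization) / requests_per_sec

# Hypothetical server: 2 GHz CPU at 60% use serving 300 requests/second
cost = cost_per_request(2_000_000_000, 0.60, 300)  # 4,000,000 cycles/request
```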

Research Results and Data Analysis

Statistical analysis is covered in the Discussion section of the study, where we
discuss the ANOVA statistical test and its results in terms of significance for the
quality of the CRM applications, using our devised quality framework of metrics. To
carry out the study, we ran various tests for the metrics defined in our framework
against the four (4) CRM applications. For performance, security and usability,
various tests were conducted using different testing tools so the results could be
examined properly. On the basis of these tests, one can assess the quality of the
CRM applications in terms of the web metrics formulated.

The following tests were conducted for our analysis:

PERFORMANCE TESTING

SECURITY TESTING

USABILITY TESTING

Various testing techniques were used to conduct these tests. We now briefly
discuss the test results and show them in tabular form.

Performance Testing

We conducted performance testing on the four applications using Microsoft
Application Center Test (ACT). ACT is designed for stress testing Web servers and
for analyzing performance and scalability problems with Web applications, including
Active Server Pages (ASP) and the components they use. It simulates a large group
of users by opening multiple connections to the server and rapidly sending HTTP
requests, and it supports several different authentication schemes and the SSL
protocol, making it suitable for testing personalized and secure sites.

Although long-duration and high-load stress testing is ACT's main purpose, its
programmable dynamic tests are also useful for functional testing. ACT is
compatible with all Web servers and Web applications that adhere to the HTTP
protocol.

To check the quality of the CRM applications we performed stress testing, which
Application Center Test supports. Stress testing is a form of testing used to
determine the stability of a given system or entity. It involves testing beyond normal
operational capacity, often to a breaking point, in order to observe the results. Stress
testing may have a more specific meaning in certain industries. [16]

Since we used Microsoft Application Center Test to conduct our tests, we briefly
discuss some of its features here. It relies on dynamic testing techniques. Dynamic
tests consist of a script that sends requests to the Web server during a test run.
These tests are called "dynamic" because the request order, the URL being
requested, and many other properties are determined at the time the test runs. The
code in the test can examine the server's previous response before creating the
properties of the next request. Because most of a request's behavior is under the
control of the script, an understanding of the HTTP protocol is necessary. If a test
will emulate the behavior of a particular HTTP user agent, particularly something
other than a common Web browser running on a personal computer, you will need
to understand how that user agent behaves.

Note that dynamic tests must run within the Application Center Test UI. This is
required because the program, not the test script, is responsible for tracking and
managing the request properties, such as the HTTP headers and user cookies.

We used five categories of concurrent users: 20, 40, 60, 80 and 100 respectively.
We created a large number of tests for the performance metrics to assess the
quality of the product. We also considered other quality-related aspects, such as the
.NET Framework, but since our CRMs were built on different platforms this could not
be extended.
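The kind of concurrent-browser simulation ACT performs can be sketched with standard-library tools; a toy local HTTP server stands in here for the CRM under test:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubCRM(BaseHTTPRequestHandler):
    """Tiny stand-in for the web application under test."""
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")
    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), StubCRM)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

statuses, lock = [], threading.Lock()

def simulated_user():
    status = urllib.request.urlopen(url).status
    with lock:
        statuses.append(status)

# One category from our runs: 20 simultaneous browser connections
users = [threading.Thread(target=simulated_user) for _ in range(20)]
for u in users:
    u.start()
for u in users:
    u.join()
server.shutdown()
```

A real load test would additionally record the timing parameters from Table – 3 (time to first/last byte, requests per second, HTTP errors) for each request.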

Tabular Representation of the tested CRM in terms of Performance

All the results are shown in tabular form, giving the exact measure of each
parameter used for performance.

These are the performance tables used to check the quality of the 4 CRMs.

PERFORMANCE TEST-1

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Simultaneous browser connections    20         20           20         20
Warm up time (secs)                 1.45       0.5          0.5        0.9
Total number of requests            9800       13625        10200      116932
Total number of connections         13155      12727        10900      12689
Average requests per second         66         29.5         35         42.9
Average time to first byte (msecs)  1.5        0.6          1.325      0.92
Average time to last byte (msecs)   5.32       2.43         3.5        4.33
Number of bytes sent/Sec (bytes)    7E+06      9760657      8666325    6738890
HTTP Errors                         1          0            0          0
Table – 4 (Performance of CRMs for 20 concurrent users)

PERFORMANCE TEST-2

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Simultaneous browser connections    40         40           40         40
Warm up time (secs)                 1.5        0.5          0.7        0.75
Total number of requests            19571      25255        20317      23843
Total number of connections         26133      25255        21995      25578
Average requests per second         63.57      30.05        37.03      47.11
Average time to first byte (msecs)  1.28       0.85         1.03       1.11
Average time to last byte (msecs)   5.32       2.43         3.5        4.33
Number of bytes sent/Sec (bytes)    6550843    9760657      8666325    7436890
HTTP Errors                         5          0            0          2
Table – 5 (Performance of CRMs for 40 concurrent users)

PERFORMANCE TEST-3

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Simultaneous browser connections    60         60           60         60
Warm up time (secs)                 1.5        0.72         0.79       0.8
Total number of requests            29443      20265        22425      28342
Total number of connections         39465      32721        35432      33825
Average requests per second         63.57      30.05        37.03      47.11
Average time to first byte (msecs)  1.28       0.85         1.03       1.11
Average time to last byte (msecs)   5.32       2.43         3.5        4.33
Number of bytes sent/Sec (bytes)    6550843    9760657      8666325    7436890
HTTP Errors                         10         4            2          7
Table – 6 (Performance of CRMs for 60 concurrent users)

PERFORMANCE TEST-4

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Simultaneous browser connections    80         80           80         80
Warm up time (secs)                 1.45       0.5          0.5        0.9
Total number of requests            38980      30453        36522      37675
Total number of connections         54620      40822        45987      50465
Average requests per second         66         29.5         35         42.9
Average time to first byte (msecs)  1.5        0.6          1.325      0.92
Average time to last byte (msecs)   5.32       2.43         3.5        4.33
Number of bytes sent/Sec (bytes)    6854853    9760657      8666325    6738890
HTTP Errors                         22         9            14         17
Table – 7 (Performance of CRMs for 80 concurrent users)

PERFORMANCE TEST-5

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Simultaneous browser connections    100        100          100        100
Warm up time (secs)                 1.5        0.8          0.9        1.3
Total number of requests            49825      37568        40918      48431
Total number of connections         65775      12727        10900      67293
Average requests per second         66         29.5         35         42.9
Average time to first byte (msecs)  1.5        0.6          1.325      0.92
Average time to last byte (msecs)   5.32       2.43         3.5        4.33
Number of bytes sent/Sec (bytes)    6854853    9760657      8666325    6738890
HTTP Errors                         52         18           25         47
Table – 8 (Performance of CRMs for 100 concurrent users)

PERFORMANCE TEST-6

                                    Unify CRM  Sales Force  Sugar CRM  Enterprise CRM  Standard Value
Warm up time (secs)                 72.5       25           25         45              2
Total number of requests            65.3       90.8         68         77.9            15000
Total number of connections         87.7       84.8         72.6       84.5            15000
Average requests per second         66         29.5         35         42.9            100
Average time to first byte (msecs)  75         30           66.2       46              2
Average time to last byte (msecs)   53.2       24.3         35         43.3            10
Number of bytes sent/Sec (bytes)    68.5       97.6         86.6       67.3            10,000,000
HTTP Errors                         52         18           25         47              100
Table – 9 (Performance Test with random No. of Users)

Graphical Representation of performance measures

The performance parameters have different measuring units, ranging from
milliseconds to seconds and from bits to bytes, which makes them difficult to present
together on one chart. For this reason we expressed each parameter as a
percentage of a standardized value, and the measures are shown accordingly. We
used the same standardized values for all the CRM applications so the graph
reflects accurate, comparable results.
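The percentage conversion can be sketched as follows; for example, Unify CRM's 1.45 s warm-up time against the 2 s standard value gives the 72.5 shown in Table – 9:

```python
def to_percent(measured: float, standard: float) -> float:
    """Express a raw measure as a percentage of its standardized reference value."""
    return measured / standard * 100

# Unify CRM warm-up time: 1.45 s against the 2 s standard (cf. Table - 9)
warm_up_pct = to_percent(1.45, 2)
```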

[Figure: bar chart comparing Unify CRM, Sales Force, Sugar CRM and Enterprise
CRM across the normalized performance parameters (warm up time, total number of
requests, average time to first byte, number of bytes, etc.), scale 0-100%.]
Fig – 2 (Graphical Representation of Performance Measures)

Security Testing

To conduct security testing we used IBM Rational AppScan Standard Edition, an
industry-leading Web application security testing suite that scans and tests for all
common web application vulnerabilities, including those identified in the WASC
threat classification, such as SQL injection, cross-site scripting and buffer overflow.

• Provides broad application coverage, including Web 2.0/Ajax applications
• Generates advanced remediation capabilities including a comprehensive task list to
ease vulnerability remediation
• Simplifies security testing for non-security professionals by building scanning
intelligence directly into the application
• Features over 40 out-of-the-box compliance reports, including PCI Data Security
Standards, ISO 17799, ISO 27001, Basel II, SB 1386 and PABP (Payment
Application Best Practices) [17]

It scans and tests for all common Web application vulnerabilities, including
SQL injection, cross-site scripting and buffer overflow.

Tabular Representation of Security Measures in terms of Quality

Here we present a tabulated representation of our security metric parameters.

Security Parameters           Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
SQL injection                 15         0            2          7
Cross-Site Scripting Attacks  42         5            7
Session Hijacking             3          0            0          1
Denial of Service             50         38           42         43
Buffer Overflows              87         35           59         58
Table – 10 (Representation of Security Parameters)

Graphical Representation of Security Measures in terms of Quality

[Figure: bar chart of vulnerability counts per CRM (SQL injection, cross-site
scripting, session hijacking, denial of service, buffer overflows) for Unify CRM,
Sales Force, Sugar CRM and Enterprise CRM, scale 0-100.]
Fig – 3 (Graphical Representation of Security Measures)

Usability Testing

We conducted usability testing in terms of effectiveness, efficiency and user
satisfaction. Usability is defined by five quality components:

• Learnability: How easy is it for users to accomplish basic tasks the first time
they encounter the design?
• Efficiency: Once users have learned the design, how quickly can they
perform tasks?
• Memorability: When users return to the design after a period of not using it,
how easily can they reestablish proficiency?
• Errors: How many errors do users make, how severe are these errors, and
how easily can they recover from the errors?
• Satisfaction: How pleasant is it to use the design?

There are many other important quality attributes. A key one is utility, which refers
to the design's functionality: Does it do what users need? Usability and utility are
equally important: It matters little that something is easy if it's not what you want. It's
also no good if the system can hypothetically do what you want, but you can't make
it happen because the user interface is too difficult. To study a design's utility, you
can use the same user research methods that improve usability.

Usability Testing Tool

To conduct our usability testing we devised a usability questionnaire, which aided us
in determining the usability of the CRM applications. The questionnaire has already
been discussed in the earlier Web Metrics section.

Tabular Representation of Usability Measures in terms of Quality

Usability Parameters                             Unify CRM  Sales Force  Sugar CRM  Enterprise CRM
Training time                                    80         70           60         65
Number of commands/actions per task              10         55           49         70
Number of commands/features that are never used  75         62           53         10
Number of times "help" is accessed               89         64           47         20
Time to complete a task                          10         59           50         64
Error rate                                       90         5            20         56
Number of tasks completed within a given time    95         70           50         62
Error recovery time                              10         67           52         20
Table – 11 (Usability Metrics)

Graphical Representation of Usability Measures in terms of Quality


[Figure: bar chart of usability measures (training time, time to complete a task, etc.)
for Unify CRM, Sales Force, Sugar CRM and Enterprise CRM, scale 0-100%.]

Fig – 4 (Graphical Representation of Usability Measures)

Discussion of Findings with Statistical Analysis


Since we used four (4) different CRM applications to verify our metrics framework, the
ANOVA statistical analysis test was used to test all hypotheses.

Hypothesis 1
From the table below we can see that the Sig. (significance) value for every
performance metric is less than the 0.05 level of significance. This leads us to reject
the null hypothesis H0 Perf and accept the alternate hypothesis H1 Perf. Thus, we
can infer that the performance metrics in the framework have a significant impact on
the quality of a web application.

ANOVA - TEST
                                                Sum of Squares             df   Mean Square               F        Sig.
SimultaneousBrowserConnections  Between Groups  1767.294                   2    883.647                   .047     .024
                                Within Groups   8872895.206                471  18838.419
                                Total           8874662.500                473
WarmUpTime                      Between Groups  64713022157149300000.000   2    32356511078574660000.000  25.886   .000
                                Within Groups   587481110003197000000.000  470  1249959808517442000.000
                                Total           652194132160347000000.000  472
TotalRequests                   Between Groups  1622.989                   2    811.495                   165.212  .000
                                Within Groups   2313.477                   471  4.912
                                Total           3936.466                   473
TotalConnections                Between Groups  89438483925.943            2    44719241962.971           434.481  .02
                                Within Groups   48478011510.397            471  102925714.459
                                Total           137916495436.340           473
AvgRequestsPerSec               Between Groups  17925544532.297            2    8962772266.149            371.106  .005
                                Within Groups   11375360433.156            471  24151508.351
                                Total           29300904965.454            473
AvgTimeFirstByte                Between Groups  6.227                      2    3.114                     .031     .014
                                Within Groups   47872.068                  471  101.639
                                Total           47878.295                  473
AvgTimeLastByte                 Between Groups  1174906.874                2    587453.437                69.192   .007
                                Within Groups   3998899.936                471  8490.233
                                Total           5173806.810                473
NumberOfBytesSent               Between Groups  4.482                      2    2.241                     13.763   .003
                                Within Groups   76.699                     471  .163
                                Total           81.181                     473
HttpErrors                      Between Groups  4.482                      2    5.114                     13.763   .011
                                Within Groups   76.699                     471  125.163
                                Total           81.181                     473
Table – 12 First ANOVA Test

Hypothesis II

We can clearly see below that the Sig. (significance) value for every security metric
is less than the 0.05 level of significance, which means we can reject the null
hypothesis H0 Sec and accept the alternate hypothesis H1 Sec. Thus, it is inferred
that the security metrics in the framework have a significant impact on the quality of
a web application.

ANOVA - TEST

                                    Sum of Squares  df  Mean Square  F     Sig.
SQLinjection        Between Groups  .604            2   .302         .377  .032
                    Within Groups   15.214          19  .801
                    Total           15.818          21
CrossSiteScripting  Between Groups  .286            2   .143         .173  .004
                    Within Groups   15.714          19  .827
                    Total           16.000          21
SessionHijacking    Between Groups  .073            2   .037         .138  .002
                    Within Groups   5.018           19  .264
                    Total           5.091           21
DenialOfService     Between Groups  .006            2   .003         .014  .021
                    Within Groups   4.357           19  .229
                    Total           4.364           21
BufferOverflows     Between Groups  63.955          2   31.977       .752  .014
                    Within Groups   807.500         19  42.500
                    Total           871.455         21
Table – 13 Second ANOVA Test

Hypothesis III
As is clear from the table below, the Sig. (significance) value for the usability metrics
(with the exception of DecisionTime and ActionsPerTask) is less than the 0.05 level
of significance, which means we can reject the null hypothesis H0 Usab and accept
the alternate hypothesis H1 Usab. Thus, we can infer that the usability metrics in the
framework have a significant impact on the quality of a web application.

ANOVA - TEST

                                    Sum of Squares  df  Mean Square  F      Sig.
TrainingTime        Between Groups  .286            2   .143         .173   .031
                    Within Groups   15.714          19  .827
                    Total           16.000          21
ProficiencyTime     Between Groups  63.955          2   31.977       .752   .009
                    Within Groups   807.500         19  42.500
                    Total           871.455         21
ErrorRate           Between Groups  9.657           2   4.829        .684   .006
                    Within Groups   134.161         19  7.061
                    Total           143.818         21
ErrorRecoveryTime   Between Groups  2.255           2   1.127        .931   .002
                    Within Groups   23.018          19  1.211
                    Total           25.273          21
DecisionTime        Between Groups  16.578          2   8.289        1.578  .232
                    Within Groups   99.786          19  5.252
                    Total           116.364         21
PositiveStatements  Between Groups  3.448           2   1.724        .367   .000
                    Within Groups   89.143          19  4.692
                    Total           92.591          21
NegativeStatements  Between Groups  4.294           2   2.147        .479   .004
                    Within Groups   85.161          19  4.482
                    Total           89.455          21
ActionsPerTask      Between Groups  .604            2   .302         .377   .090
                    Within Groups   15.214          19  .801
                    Total           15.818          21
FeaturesNeverUsed   Between Groups  .073            2   .037         .138   .000
                    Within Groups   5.018           19  .264
                    Total           5.091           21
TaskCompletionTime  Between Groups  .006            2   .003         .014   .002
                    Within Groups   4.357           19  .229
                    Total           4.364           21
Table – 14 Third ANOVA Test

Conclusion
Based on the study conducted on web applications in general and CRM
applications in particular, driven by the Web Metrics Framework, we have drawn the
following conclusions about the improvement of the CRM application of the
organization under study compared to the open source CRM applications.

1. It is quite evident that with more adequate measures of performance, security
and usability, we make more consistent progress towards achieving a high-quality
product.

2. Results show that in the presence of more empirical measures of performance
the overall quality of the product increases; the same is the case with the usability
and security measures.

3. From our studies we conclude that Salesforce CRM is the most effective in terms
of performance, security and usability. Second is the SourceForge CRM, which also
shows good results when these measures are tested. Enterprise CRM is at the third
level in terms of overall quality, and fourth is the Unify CRM from a local software
house.

4. The Quality Manager of the software house has been provided with a set of
metrics to improve overall productivity, both at the individual level and at the team
performance level.

5. The metrics framework is not specific to CRM applications; it can also be applied
to other web-based applications to improve their quality.

Based on the statistical analysis of data collected from the organization, we have
concluded that:

1. No documentation is being maintained as far as metrics related to the quality of
the product are concerned.

2. It has now become clear why the first three versions of Unify CRM failed: proper
measures were absent.

3. If these measures are used at the appropriate stages of the deliverables, quality
can be raised considerably and the defects of the application removed.

4. The post-release issues have no relationship with software defects after the
release of the software.

Future Work
In this study, we presented three hypotheses related to the Web metrics framework
model. The next step is to apply the framework to an application and, finally, to
study the effect of the proposed Web Metrics framework on overall software quality.
This is a limitation of the present study; in future work we will provide the quality
framework analysis for other applications. More work can be done on other metrics
related to the quality of web applications. We intend to improve upon this framework
and test it on other web applications as well.


References (Revised)
[1] Athula Ginige and San Murugesan, "Web Engineering: An Introduction," IEEE
Multimedia, Vol. 8, No. 1, January 2001, pp. 14-18.
[2] Jukka Heinonen and Marcus Hägert, "Web Metrics," research paper, 2004.
http://www.abo.fi/~kaisa/
[3] Institutional Web Management Workshop 2002: The Pervasive Web.
http://www.ukoln.ac.uk/web-focus/events/workshops/webmaster-2002/
[4] "A Quality Framework For Web Site Quality."
www.ukoln.ac.uk/web-focus/papers/www2005
[5] R.S. Pressman & Associates Inc.
[6] Devanshu Dhyani, Ng Wee Keong and Sourav S. Bhowmick, "A Survey of Web
Metrics" [Bibliography of Web metrics], Nanyang Technological University.
http://www.nsdl.comm.org/ (National Science Digital Library)
[7] Alessandro Marchetto, "A Concerns-based Metrics Suite for Web Applications,"
Dipartimento di Informatica e Comunicazione, Università degli Studi di Milano,
Via Comelico 39, 20135 Milano, Italy. Alessandro.Marchetto@unimi
[8] Devanshu Dhyani, Ng Wee Keong and Sourav S. Bhowmick, "A Survey of Web
Metrics," Nanyang Technological University.
[9] http://www.clickz.com/showPage.html?page=992351
[10] Jonathan W. Palmer, "Web Site Usability, Design, and Performance Metrics,"
University of Maryland, R. H. Smith School of Business, Decision and Information
Technologies, 4348 Van Munching Hall, College Park, Maryland 20742-1871.
jpalmer@rhsmith.umd.edu
[11] ccnmtl.columbia.edu/projects/qmss/anova_about.html
[12] searchsoftwarequality.techtarget.com/generic/0,295582,sid92.html
[13] http://shiflett.org/articles/the-truth-about-sessions
[14] http://msdn2.microsoft.com/en-us/library/ms998581.aspx
[15] http://www.useit.com/jakob/
[16] http://www-306.ibm.com/software/awdtools/appscan/standard/features/?S_CMP=wspace

Web Metrics Related References

1. Website Performance
http://www.hurolinan.com/resources/resource.asp?LocatorCode=416
http://searchsoftwarequality.techtarget.com/originalContent/0,289142,sid92_gci1260130,00.html

2. Web Security
http://www.webopedia.com/TERM/S/security.html
http://www.websense.com/global/en/ResourceCenter/Glossary/web-security.php
http://www.arctecgroup.net/pdf/0703-OWASPMetrics.pdf
http://www.securitymetrics.org/content/attach/Metricon2.0/Grossman_Metricon_2.pdf

3. Usability
http://www.useit.com/alertbox/20030825.html
www.ccs.neu.edu/home/tarase/vita.htm
http://sigchi.org/chi97/proceedings/sig/jms.htm
