A CASE STUDY ON
Presented To:
Dr. Qaiser S. Durrani
Presented By:
Muhammad Irfan Khan (07L-0858)
Saqib Aziz (07L-0861)
Ahmad Mohsin (07L-0852)
Case Study
Phase III
Table of Contents
Research Results and Data Analysis........................................................................................27
Discussion of Findings with Statistical Analysis.....................................................................40
Conclusion...............................................................................................................................46
Future Work.............................................................................................................................47
References (Revised)...............................................................................................................48
[4]. A Quality Framework For Web Site Quality ..............................................................48
[5] R.S. Pressman & Associates Inc..................................................................................48
Within a short period, the Internet and World Wide Web have become ubiquitous,
surpassing all other technological developments in our history. They have also
grown rapidly in their scope and extent of use, significantly affecting all aspects of
our lives. Industries such as manufacturing, travel and hospitality, banking,
education, and government are Web-enabled to improve and enhance their
operations. E-commerce has expanded quickly, cutting across national boundaries.
Even traditional legacy information and database systems have migrated to the
Web [1]. Advances in wireless technologies and Web-enabled appliances are
triggering a new wave of mobile Web applications. As a result, we increasingly
depend on a range of Web applications. Now that many of us rely on Web-based
systems and applications, they need to be reliable and perform well. To build these
systems and applications, Web developers need a sound methodology, a
disciplined and repeatable process, better development tools, and a set of good
guidelines. The emerging field of Web engineering fulfils these needs. It uses
scientific, engineering, and management principles and systematic approaches to
successfully develop, deploy, and maintain high-quality Web systems and
applications. It aims to bring the current chaos in Web based system development
under control, minimize risks, and enhance Web site maintainability and quality.
The main objectives of this study are:
Businesses these days employ the Web as an integral part of their strategies to
make their products available worldwide. This has led to an increase in the number of
web applications deployed for business expansion. With this increase, quality has become
the most overlooked aspect of web products. Because of their global reach, a small
compromise in quality has serious implications for the reputation of the business. This
demands the formulation of a standard framework to ensure the quality of online applications.
We intend to study and analyze Quality Assurance issues with regard to the Web,
as quality is often not given much importance across the whole SDLC. The industry is
maturing, however, and understanding of Quality Assurance throughout the life cycle of
a product under construction has improved considerably. When we talk about Web quality,
Web metrics come into play; Web metrics and Quality Assurance are closely related to each
other. In our problem statement we will define a framework for the quality of web
applications. Metrics, as we know, refer to standards of measurement. Therefore, web
metrics are standardized ways of measuring something that relates to the Web.
In this case study we focus on Web applications and, to be more precise, we will
use CRM applications and map these metrics onto them to evaluate quality.
We have decided to obtain our main data from a renowned software house, Mindshare
Solutions, as it specializes in CRM applications. The problem is that they have
introduced three demo versions of their CRM application, named Unify CRM, but the
reported results indicate low quality as far as the overall performance of the
product is concerned. We decided to carry out a comprehensive comparison of Unify CRM
with other renowned open-source CRMs and evaluate it on the basis of our Web
metrics framework, in which we focus on performance, usability and security,
and to assess the quality of the products against these quality metrics for
Web applications.
When considering the Web, it becomes clear that there is an abundance of different
things we can measure. Take web traffic as an example: while it certainly is
possible to review the entire Web and how "busy" it is [2], or web page similarity
as computed by Google and other search engines, a measurement that is not performed
over a sufficiently wide scope (more than a small set of pages, such as those of a
particular site) is likely to be of limited practical usefulness. In other words,
different metrics apply to different views.
Web applications must incorporate proper performance, security, usability and other
measures and should perform efficiently. Similarly, critical systems need to be
quality assured against breakdown and should have efficient backup flows.
We will be defining a framework for Quality of web Applications. Metrics refer to
standards of measurement. Therefore, web metrics are standardized ways of measuring
something that relates to the Web Applications Quality.
Web metrics play a very important role in determining the actual characteristics
of a web application's functionality. In this study we have identified
key web metrics that actually affect the quality of Web applications.
To support our study in a more specific way, we decided to select a few of these Web
metrics and check them against Web applications to see their impact on
Quality Assurance. To be more realistic, we chose different CRM applications to
check the impact of these metrics on Quality Assurance.
Following metrics will be used in order to determine the extent to which CRM
Applications adhere to Quality:
- Performance
- Security
- Usability
Although we had identified a few other metrics in our initial study phase, we
considered the above-mentioned metrics more precisely suited to our case study.
Scope
This study encompasses key web QA metrics that can be used to ensure the quality of
Web applications, and specifically of CRM applications. The following are the important
key areas:
1- Web Traffic
Web traffic can be measured as the number of hits a site receives during a given
interval, but also as the amount of data transferred during that interval. It is
useful for determining popularity and, through trend analysis, for estimating
future needs.
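Both views of traffic described above can be computed directly from server access logs. The sketch below is illustrative only (the log records and field names are hypothetical): it aggregates hits and bytes transferred per hour.

```python
# Illustrative sketch: web traffic measured two ways -- hits per interval
# and bytes transferred per interval -- from simplified access-log records.
from collections import defaultdict

def traffic_by_hour(records):
    """records: iterable of (hour, bytes_sent) tuples from an access log."""
    hits = defaultdict(int)
    volume = defaultdict(int)
    for hour, nbytes in records:
        hits[hour] += 1        # one request = one hit
        volume[hour] += nbytes  # data transferred in that hour
    return dict(hits), dict(volume)

# Hypothetical log: two requests in hour 10, one in hour 11.
log = [(10, 512), (10, 2048), (11, 1024)]
hits, volume = traffic_by_hour(log)
print(hits)    # {10: 2, 11: 1}
print(volume)  # {10: 2560, 11: 1024}
```

Trend analysis over such per-interval counts is then a matter of comparing successive intervals.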
3- Accessibility
Accessible design principles often result in a significant overall increase in
the usability of a Web site - in terms of faster completion of tasks, lower
error rates, and more effective retention of knowledge of the site by repeat
users.
4- Relevance
Relevance is a direct measure of how well a particular page satisfies the
information need of some user, typically expressed as a set of query words.
As the Web grows exponentially, it seems logical to assume that the number
of documents that contain the same query words is typically also increasing.
The need for metrics that can order all such documents so that those that are
most relevant can be examined first is thus greater than ever.
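A minimal illustration of such an ordering is a term-frequency count of the query words in each page. Real search engines use far more sophisticated ranking, so this sketch is conceptual only, and the page texts are invented:

```python
# Conceptual sketch of a relevance metric: score a page by how often
# it contains the user's query words, then order pages by that score.
def relevance(page_text, query_words):
    words = page_text.lower().split()
    return sum(words.count(q.lower()) for q in query_words)

# Hypothetical document collection.
pages = {
    "a": "web quality metrics for web applications",
    "b": "history of the printing press",
}
query = ["web", "quality"]

# Most relevant pages first.
ranked = sorted(pages, key=lambda p: relevance(pages[p], query), reverse=True)
print(ranked)  # ['a', 'b']
```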
7- Performance
A fast website greatly improves the user experience and brings returning
visitors.
8- Security
Online security is perhaps the aspect most often overlooked in local software
houses; websites with poor security implementations will invariably harm both
users and the business.
9- Ease of Use
Quality issues regarding the ease of use of a web application are important in
that they help a business to retain their clientele. Also, such applications are
easier to maintain and change.
10- Portability
With a growing range of computer hardware and software platforms, it is
important for ecommerce applications to be able to perform consistently and
provide similar functionality in different computing environments.
11- Reliability
As with traditional software, reliability is always an important quality issue for
users. A system should always produce consistent results and outputs for a given
fixed input; otherwise the application cannot be trusted to deliver high-quality
service.
Prior Literature (Revised)
Quality Assurance is an important step in the website development process and, by
all means, should not be skipped. A broken link or a misspelled word may seem like
trivial mistakes, but they can greatly undermine the credibility of your website. You
want people who visit your site to feel assured of the quality of the information they
find.
As an emerging discipline, Web engineering actively promotes systematic,
disciplined and quantifiable approaches towards the successful development of high-
quality, ubiquitously usable Web-based systems and applications [3].
A simple definition of quality in the context of Web sites is that 'quality is meeting
requirements'. This definition works because, by creating technical specifications and
requirements that describe the various attributes of a Web site as well as how it should
function, you set yourself goals to achieve and determine specific indicators of
quality. Quality can then be measured by testing, at intervals, various aspects of
your Web site and the complex relationships between all areas of the site.
The current World Wide Web has many flaws, with a great many resources failing to
comply with standards. As we move towards a richer, more structured Web, it will be
essential that quality assurance is built into development processes – unlike HTML,
XML applications formally require strict adherence to the standards and may fail to
render if this is not the case. However, even when a resource does comply with
standards, it does not mean that the user experience will necessarily be a happy one.
Thus, a combination of supplier QA and user satisfaction assessment is needed.
However, linking the subjective perceptions of users with the QA practices of
suppliers is not a simple task. The next stage of work is to model the relationships
between user satisfaction and supplier initiatives (such as QA procedures). One way
in which this might be done is through quality function deployment (QFD): “a
structured and disciplined process that provides a means to identify and carry the
voice of the customer through each stage of product and or service development
and implementation” [4].
Software process and product metrics are quantitative measures that enable
software people to gain insight into the efficacy of the software process and the
projects that are conducted using the process as a framework. Basic quality and
productivity data are collected. These data are then analyzed, compared against
past averages, and assessed to determine whether quality and productivity
improvements have occurred. [5]
The Internet and the World Wide Web (WWW, or simply the Web) are specific
examples of general heterogeneous systems. QA for these systems is gaining
importance. [6]
Web applications have become very complex and crucial, especially when combined
with areas such as CRM (Customer Relationship Management) and BPR (Business
Process Reengineering). The scientific community has focused attention on Web
application design, development, analysis, and testing, by studying and proposing
methodologies and tools [7].
Given the organic growth of the Web, we require new metrics that provide deeper
insight on the Web as a whole and also on individual sites from different
perspectives. Arguably, the most important motivation for deriving such metrics is the
role they can play in improving the quality of information available on the Web. [8]
When you can measure what you are speaking about, and express it in numbers,
you know something about it; but when you cannot express it in numbers, your
knowledge is of a meager and unsatisfactory kind; it may be the beginning of
knowledge, but you have scarcely in your thoughts advanced to the state of
science.[9]
Metrics help organizations generate more effective Web sites and provide measures
that practitioners understand and that academics can replicate and analyze. To provide
practical value, metrics should identify frequency of measurement, frequency of review,
source of data, rationale for introducing the measure, who will act on the data, and
the purpose of the measure (Neely 1998). For scientific, quantitative rigor, metrics
should exhibit, at a minimum, construct validity and reliability (Straub 1989, Cook
and Campbell 1979). [10]
For effective measurement, the measurement activity should have clear objectives
and identify the types of attributes that can be measured and appropriate scales.
Empirical relations for an attribute should be identified in advance, and the
measurement scale must be meaningful. [10]
Our main aim is to identify a set of key quality aspects and then formulate a
workable framework for the quality metrics thus identified. Our research
methodology is a systematic process to achieve the objectives of this
study. The methodology that was formulated for the case study is briefly explained below.
Selection of CRMs
We have selected certain Customer Relationship Management (CRM) systems with
the focus on QA metrics. Our selection consists of a mix of open-source and
proprietary CRMs. We have selected four systems:
1. Unify CRM
2. Salesforce CRM (http://www.salesforce.com/)
3. SugarCRM (http://www.sugarcrm.com/crm/)
4. Enterprise CRM
Salesforce CRM
The proven leader in on-demand customer relationship management (CRM),
salesforce.com empowers customers to stand out from the crowd. It does so by
delivering the most innovative technology and making it as easy as possible to share
and manage business information. Its solutions combine award-winning
functionality, proven integration, point-and-click customization, global capabilities,
and the best user experience; the result is CRM success. That is why Salesforce
has earned the trust of its customers and a customer success rate of
95%. Salesforce SFA enables companies to drive sales productivity, increase
visibility, and expand revenues with an affordable, easy-to-deploy service that
delivers success to companies of all sizes.
The Salesforce solution for customer service gets companies up and running in a
matter of weeks with a call center application that is loved by agents and a customer
self-service application— powered by Web 2.0—that generates new levels of
customer loyalty.
Marketing
Salesforce Marketing enables closed-loop marketing to execute, manage, and
analyze the results of multichannel campaigns. Marketing executives can measure
the ROI of their budgets, tie revenue back to specific marketing programs, and make
adjustments in real time.
Content
Salesforce Content brings Web 2.0 usability to your business content so you can
share it more effectively and enhance collaboration within your organization.
Empower employees to find the exact documents they need, right from the business
applications they use on a daily basis.
Analytics
Salesforce Analytics empowers business users at every level to gain relevant insight
and analysis. With real-time reporting, calculations, and dashboards, businesses can
optimize performance, decision making, and resource allocation.
Custom Applications
Industry Applications
AppExchange Applications
It provides sales reps the capability to develop accurate forecasts, seamlessly share
information across sales teams, and configure products and services to meet the
unique needs of each customer. Its simple user interface is designed to improve
sales rep productivity, yet support best practices across the entire sales
organization. Sales Management's key features and capabilities include:
Improve forecast accuracy - The sales pipeline is continually updated in real time so
that everyone in your organization is provided with a clear view, allowing resources
to be focused accordingly.
Close more deals - Sales process visibility enables each member of your sales team
to know precisely what the other teammates are working on, allowing them to
collaborate to transform prospects into profitable customers.
Shorten and standardize unique sales cycles - Because different channels require
different sales processes, Sales Manager allows an unlimited number of unique
sales methodologies or sales process to be created. Teams, both inside and outside
your organization can effectively work together to close accounts by scheduling
events, assigning tasks, coordinating meetings, flagging new opportunities, and
updating client files on every account.
Enable collaborative and consistent customer management - Real-time, secure
access to detailed account data enables you and your channel partners to
collaborate with sales, customer service & support, and marketing personnel. With
instant access to all communication, including email, notes, calls, resolutions, and
more, you can collectively manage customer relationships across your entire
extended enterprise.
Recognize "big picture" market trends - With Salesforce's flexible reporting system,
it's easy to review and analyze sales data, both current and historical, allowing sales
management to spot changes in customer behavior or shifts in key market
indicators. Armed with a comprehensive contextual view of both past and current
events, your sales organization can respond to evolving customer needs and
economic conditions.
User Interface : Web-based
Project UNIX name : egs
Registered : 2003-05-22 06:18
All CRMs will be passed through the metrics framework, and data will be collected
against different parameters of quality. The data will be entered into statistical
software; we will use SPSS 15.0 for the statistical analysis. We intend to find
out whether there is a significant difference in the quality of the selected CRMs
in terms of performance, security and usability. This can be accomplished with the
application of a statistical test. As the total number of CRMs under study is four
(4), we will use the ANOVA test. [11]
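As a sketch of what SPSS computes for us, the one-way ANOVA F statistic is the ratio of the between-group mean square to the within-group mean square. The three groups below are illustrative numbers only, not our CRM measurements:

```python
# Illustrative sketch: one-way ANOVA F statistic computed by hand.
# F = (SS_between / df_between) / (SS_within / df_within)
def one_way_anova(groups):
    k = len(groups)                       # number of groups
    n = sum(len(g) for g in groups)       # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, f

# Hypothetical measurements for three systems (not our study data).
groups = [[10, 12, 11], [20, 21, 19], [30, 29, 31]]
ssb, ssw, f = one_way_anova(groups)
print(round(ssb, 3), round(ssw, 3), round(f, 3))  # 542.0 6.0 271.0
```

SPSS reports the same quantities as the "Sum of Squares", "df", "Mean Square" and "F" columns of its ANOVA table.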
Organization Information
Size: 53
Expertise: Enterprise Solutions Provider

Project Information
No. of Projects under Study: 4
Domain of Projects: Customer Relationship Management System
Communication Person: Project Manager of corresponding Organization’s Quality Team

Table – 1
Interpretation of Data
After the statistical analysis of the data, we will describe the results, draw
inferences, and recommend a course of action for the quality areas where Unify
CRM needs improvement.
Hypotheses:
For the purpose of analytical study, we have assumed the following hypotheses.
Hypothesis I
This hypothesis enables us to evaluate the impact of our framework’s performance
metrics on the quality of a web application.
Alternate Hypothesis H1 Perf
Performance metrics in our framework have a significant effect on the quality
of a web application.
Hypothesis II
Through this hypothesis we can evaluate the impact of our framework’s security
metrics on the quality of a web application.
Hypothesis III
This hypothesis helps us evaluate the impact of our framework’s usability metrics on
the quality of a web application.
Mathematical Description of Hypotheses (Revised)
For this study, we have selected a significance level of α = 0.05. This is the
probability of rejecting the null hypothesis when the null hypothesis is true, i.e.
there is a 5 in 100 chance that the null hypothesis is wrongly rejected.
Hypotheses that were formulated with the hope that they would be rejected led to the
use of the term null hypothesis. Today this term is applied to any hypothesis we wish
to test, and it is denoted by H0. The rejection of H0 leads to the acceptance of an
alternative hypothesis, denoted by H1.
In the hypotheses, the performance, security and usability are the independent
variables, and quality is the dependent variable. This is because quality depends on
metrics in the framework.
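The decision rule implied by α = 0.05 can be stated as a one-line comparison between the test's significance (p-value) and α; the p-values below are illustrative:

```python
# Illustrative decision rule for hypothesis testing at alpha = 0.05:
# reject H0 exactly when the observed significance falls below alpha.
ALPHA = 0.05

def decide(p_value, alpha=ALPHA):
    return "reject H0 (accept H1)" if p_value < alpha else "fail to reject H0"

print(decide(0.024))  # reject H0 (accept H1)
print(decide(0.310))  # fail to reject H0
```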
Security Metrics
The following are the security parameters used to evaluate the quality of the CRM
applications, i.e. to determine the extent to which security affects the overall
quality of the product.
1. SQL injection
2. Cross-Site Scripting Attacks
3. Session Hijacking
4. Denial of Service
5. Buffer Overflows
SQL injection:
SQL injection is a type of security exploit in which the attacker adds Structured
Query Language (SQL) code to a Web form input box to gain access to resources or
make changes to data. An SQL query is a request for some action to be performed
on a database. [12]
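The exploit and its standard mitigation can be sketched with an in-memory SQLite table (the table, column and input values below are purely illustrative):

```python
# Illustrative sketch of SQL injection and its mitigation, using an
# in-memory SQLite database (hypothetical table and data).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attack = "x' OR '1'='1"  # classic injection payload typed into a form field

# Vulnerable: attacker input concatenated into the SQL string, so the
# injected OR clause becomes part of the query and matches every row.
unsafe = db.execute(
    f"SELECT secret FROM users WHERE name = '{attack}'").fetchall()

# Mitigated: a parameterized query treats the input as a plain value.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attack,)).fetchall()

print(unsafe)  # [('s3cret',)]  -- data leaked
print(safe)    # []             -- the literal string matched nothing
```

Parameterized (placeholder) queries are the standard defence, because the database driver never interprets the input as SQL.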
Cross-site scripting (XSS) is a security exploit in which the attacker inserts malicious
code into a link that appears to be from a trustworthy source. When someone
clicks the link, the embedded program is submitted as part of the client's
Web request and can execute on the user's computer, typically allowing the attacker
to steal information.
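The usual defence is to escape untrusted input before echoing it into a page, so the payload renders as inert text rather than executing. A minimal sketch using the Python standard library (the payload string is illustrative):

```python
# Illustrative sketch of the basic XSS defence: HTML-escape untrusted
# input before including it in a page. html.escape is stdlib.
import html

user_input = "<script>steal(document.cookie)</script>"  # hypothetical payload
page = "<p>Hello, " + html.escape(user_input) + "</p>"

# The angle brackets are now entities, so browsers render the payload
# as visible text instead of executing it.
print(page)
```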
Denial of Service
Service                  Detail
SYN Attack               When a session is initiated between a Transmission Control
                         Protocol (TCP) client and server in a network, a very small
                         buffer space exists to handle the usually rapid "hand-shaking"
                         exchange of messages that sets up the session.
Teardrop Attack          This type of denial of service attack exploits the way the
                         Internet Protocol (IP) requires a packet that is too large for
                         the next router to handle to be divided into fragments.
Smurf Attack             In this attack, the perpetrator sends an IP ping (or "echo my
                         message back to me") request to a receiving site. The ping
                         packet specifies that it be broadcast to a number of hosts
                         within the receiving site's local network.
Viruses                  Computer viruses, which replicate across a network in various
                         ways, can be viewed as denial-of-service attacks where the
                         victim is not usually specifically targeted but is simply a
                         host unlucky enough to get the virus.
Physical Infrastructure  Here, someone may simply snip a fiber optic cable. This kind
Attacks                  of attack is usually mitigated by the fact that traffic can
                         sometimes be rerouted quickly.
Table – 2 (Security Metric Measures)
Buffer Overflows
A buffer overflow occurs when a program or process tries to store more data in a
buffer (temporary data storage area) than it was intended to hold. Since buffers are
created to contain a finite amount of data, the extra information - which has to go
somewhere - can overflow into adjacent buffers, corrupting or overwriting the valid
data held in them. Although it may occur accidentally through programming error,
buffer overflow is an increasingly common type of security attack on data integrity.
Session Hijacking
Session hijacking, also known as TCP session hijacking, is a method of taking over
a Web user session by surreptitiously obtaining the session ID and masquerading as
the authorized user. Once the user's session ID has been accessed (through
session prediction), the attacker can masquerade as that user and do anything the
user is authorized to do on the network.
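Typical mitigations are unpredictable session IDs and rotating the ID on privilege changes such as login. A minimal sketch, where the in-memory dict standing in for a session store is an illustrative assumption:

```python
# Illustrative mitigation sketch for session hijacking: generate
# unguessable session IDs and rotate them after login, so a captured
# or predicted ID loses its value.
import secrets

sessions = {}  # hypothetical in-memory session store: sid -> user

def new_session(user):
    sid = secrets.token_urlsafe(32)  # ~256 bits of randomness, not guessable
    sessions[sid] = user
    return sid

def rotate(old_sid):
    """Issue a fresh ID on privilege change (e.g. right after login)."""
    user = sessions.pop(old_sid)  # old ID is invalidated immediately
    return new_session(user)

sid = new_session("alice")
sid2 = rotate(sid)
print(sid != sid2, sid not in sessions)  # True True
```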
Usability Metrics
To determine if a system meets user/human-centered design goals of effectiveness,
efficiency, and user satisfaction, it is necessary to collect objective, quantifiable
data. The Human Factors Specialist develops experimental plans and data
collection procedures to obtain and statistically analyze measures of usability. Some
of the most frequently used metrics for this purpose are listed by goal as follows: [13]
• Effectiveness
o Training time
o Time to reach proficiency
o Number of commands/actions per task
o Number of commands/features that are never used
o Number of times "help" is accessed
• Efficiency
o Time to complete a task
o Error rate
o Number of tasks completed within a given time
o Error recovery time
o Decision time/delay
• User satisfaction
o Positive statements recorded during observations
o Negative statements recorded during observations
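Several of the listed measures can be derived mechanically from recorded test sessions. The sketch below is illustrative: the session tuples of (task seconds, error count, completed) are invented data, not observations from our study.

```python
# Illustrative sketch: deriving usability measures (error rate, tasks
# completed within a given time, average task time) from test sessions.
def usability_summary(sessions, time_limit):
    """sessions: list of (task_seconds, error_count, completed) tuples."""
    n = len(sessions)
    completed_in_time = sum(
        1 for t, _, done in sessions if done and t <= time_limit)
    total_errors = sum(e for _, e, _ in sessions)
    return {
        "error rate": total_errors / n,
        "completed within limit": completed_in_time,
        "avg task time": sum(t for t, _, _ in sessions) / n,
    }

# Hypothetical observations from three participants.
sessions = [(40, 0, True), (75, 2, True), (120, 5, False)]
summary = usability_summary(sessions, time_limit=60)
print(summary)
```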
Performance Metrics
Following are the parameters used to measure the performance of a CRM application
in order to evaluate the quality of the product. [14]

Parameter                            Detail
Simultaneous browser connections     Number of users with simultaneous connections to the server.
Warm up time (secs)                  The time required to open the first page for the first time.
Total number of requests             Total number of requests sent during one iteration.
Total number of connections          Total number of active connections to the server during one iteration.
Average requests per second          Average number of requests per second during one iteration.
Average time to first byte (msecs)   Average time in milliseconds to download the first byte after sending a request.
Average time to last byte (msecs)    Average time in milliseconds to download the last byte after sending a request.
Number of bytes sent (bytes)         Number of bytes sent in one iteration.
HTTP Errors                          Numeric value for the HTTP error.
Table – 3 (Performance Metric Measures)
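Most of the tabled measures are simple aggregates over per-request timings collected during one test iteration. The sketch below illustrates the arithmetic with invented request records (field names are hypothetical):

```python
# Illustrative sketch: computing the tabled performance measures from
# per-request records gathered during one test iteration.
def perf_summary(requests, duration_secs):
    """requests: list of dicts with ttfb_ms, ttlb_ms, bytes_sent, status."""
    n = len(requests)
    return {
        "total requests": n,
        "avg requests/sec": n / duration_secs,
        "avg time to first byte (msecs)": sum(r["ttfb_ms"] for r in requests) / n,
        "avg time to last byte (msecs)": sum(r["ttlb_ms"] for r in requests) / n,
        "bytes sent": sum(r["bytes_sent"] for r in requests),
        "http errors": sum(1 for r in requests if r["status"] >= 400),
    }

# Two hypothetical requests recorded over a 2-second iteration.
reqs = [
    {"ttfb_ms": 12, "ttlb_ms": 40, "bytes_sent": 512, "status": 200},
    {"ttfb_ms": 20, "ttlb_ms": 90, "bytes_sent": 512, "status": 500},
]
stats = perf_summary(reqs, duration_secs=2)
print(stats)
```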
This phase covers testing for the metrics defined in our framework for the four (4)
CRM applications. For performance, security and usability, various tests were conducted
using different testing tools in order to present the results properly. On the basis of
the tests conducted, one can assess the quality of the CRM applications in terms of the
Web metrics formulated.
PERFORMANCE TESTING
SECURITY TESTING
USABILITY TESTING
To conduct these tests, various testing techniques were used. We now briefly
focus on the test results and show them in tabular form.
Performance Testing
Although long-duration and high-load stress testing is Application Center Test's main
purpose, the programmable dynamic tests will also be useful for functional
testing.Application Center Test is compatible with all Web servers and Web
applications that adhere to the HTTP protocol.
To check the quality of the CRM applications we performed stress testing, as it is
supported in Application Center Test. Stress testing is a form of testing used to
determine the stability of a given system or entity. It involves testing beyond normal
operational capacity, often to a breaking point, in order to observe the results.
Stress testing may have a more specific meaning in certain industries. [16]
Since we used Microsoft Application Center Test to conduct our tests, we discuss
some of its features here. It uses dynamic testing techniques. Dynamic tests
consist of a script that sends requests to the Web server
during a test run. These tests are called "dynamic" because the request order, the
URL being requested, and many other properties, are determined at the time the test
runs. The code in the test can examine the server's previous response before
creating the properties of the next request. Because most of the request's behavior
is under the control of the script, an understanding of the HTTP protocol is
necessary. If your test will be emulating the behavior of a particular HTTP user
agent, particularly something other than a common Web browser running on a
personal computer, you will need to understand how that user agent behaves.
Note that dynamic tests must run within the Application Center Test UI. This is
required because the program, not the test script, is responsible for tracking and
managing the request properties, such as the HTTP headers and user cookies.
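Conceptually, what such a stress test does can be sketched in a few lines: fire many concurrent simulated requests and count successes and failures. Application Center Test scripts this against a real server; the fake_request stand-in below is purely illustrative.

```python
# Illustrative sketch of a stress test's core loop: many concurrent
# simulated requests, with successes and failures tallied at the end.
import threading
import time

results = []
lock = threading.Lock()

def fake_request(i):
    time.sleep(0.01)                   # stand-in for a server round trip
    status = 200 if i % 10 else 500    # hypothetically, every 10th request fails
    with lock:
        results.append(status)

threads = [threading.Thread(target=fake_request, args=(i,)) for i in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

errors = sum(1 for s in results if s >= 400)
print(len(results), errors)  # 100 10
```

A real tool replaces fake_request with genuine HTTP requests and ramps the thread count up until the server's breaking point is observed.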
Here we used five categories of concurrent users: 20, 40, 60, 80 and 100 respectively.
We created numerous tests for the performance metric to assess the quality of the
product. We also considered other quality-related aspects, such as the .NET framework,
but as our CRMs were built on different platforms, this could not be extended.
All the results are shown in tabular form, with exact measures of each parameter
used for performance.
These are the performance tables used to check the quality of the 4 CRMs.
PERFORMANCE TEST-1

PERFORMANCE TEST-2
                                   Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
Total number of requests               5            0             0            2
Table – 5 (Performance of CRMs for 40 concurrent users)
PERFORMANCE TEST-3
                                   Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
Simultaneous browser connections      60           60            60           60
Warm up time (secs)                   10            4             2            7
Table – 6 (Performance of CRMs for 60 concurrent users)
PERFORMANCE TEST-4
                                   Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
Warm up time (secs)                   66          29.5           35          42.9
Average time to first byte (msecs)    22            9            14           17
Table – 7 (Performance of CRMs for 80 concurrent users)
PERFORMANCE TEST-5
                                   Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
Simultaneous browser connections      66          29.5           35          42.9
Average time to first byte (msecs)   1.5           0.6         1.325         0.92
Average time to last byte (msecs)   5.32          2.43           3.5         4.33
Number of bytes sent/Sec (bytes)  6854853       9760657       8666325      6738890
HTTP Errors                           52            18            25           47
Table – 8 (Performance of CRMs for 100 concurrent users)
PERFORMANCE TEST-6
                                   Unify CRM   Sales Force   Sugar CRM   Enterprise CRM   Standard Value
Warm up time (secs)                   72.5          25            25           45                 2
Total number of requests                52          18            25           47               100
Table – 9 (Performance Test with random No. of Users)
The performance parameters have different measuring units, ranging from
milliseconds to seconds and from bits to bytes, so it is awkward to present
them on a single scale. For this reason we converted all parameters to a
percentile of one hundred, according to which their measures are shown. We used
specific standardized values for all the CRM applications to show accurate
results on the graph.
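The normalization can be sketched as scaling each parameter's values to a percentage of the largest observed value; the warm-up-time row from Table 9 is used for illustration:

```python
# Illustrative sketch of the chart normalization: express each
# parameter's values as a percentage of its largest observed value.
def to_percentile(values):
    top = max(values)
    return [round(100 * v / top, 1) for v in values]

warm_up = [72.5, 25, 25, 45]   # secs, from Table 9
print(to_percentile(warm_up))  # [100.0, 34.5, 34.5, 62.1]
```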
[Figure: Bar chart (0–100%) of standardized performance measures (Warm up time (secs), Total number of requests, Average time to first byte, Number of bytes) for Unify CRM, Sales Force, Sugar CRM and Enterprise CRM]
Security Testing
• Provides broad application coverage, including Web 2.0/Ajax applications
• Generates advanced remediation capabilities, including a comprehensive task list to
ease vulnerability remediation
• Simplifies security testing for non-security professionals by building scanning
intelligence directly into the application
• Features over 40 out-of-the-box compliance reports, including PCI Data Security
Standards, ISO 17799, ISO 27001, Basel II, SB 1386 and PABP (Payment
Application Best Practices) [17]
The tool scans and tests for all common Web application vulnerabilities, including
SQL injection, Cross-Site Scripting and Buffer Overflow.
Security Parameters     Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
SQL injection               15           0             2            7
Session Hijacking            3           0             0            1
Denial of Service           50          38            42           43
Buffer Overflows            87          35            59           58
Table – 10 (Representation of Security Parameters)
Graphical Representation of Security Measures in terms of Quality
[Figure: Bar chart (0–100) of security vulnerability counts (SQL injection, Session Hijacking, Denial of Service, Buffer Overflows) for Unify CRM, Sales Force, Sugar CRM and Enterprise CRM]
Usability Testing
• Learnability: How easy is it for users to accomplish basic tasks the first time
they encounter the design?
• Efficiency: Once users have learned the design, how quickly can they
perform tasks?
• Memorability: When users return to the design after a period of not using it,
how easily can they reestablish proficiency?
• Errors: How many errors do users make, how severe are these errors, and
how easily can they recover from the errors?
• Satisfaction: How pleasant is it to use the design?
There are many other important quality attributes. A key one is utility, which refers
to the design's functionality: Does it do what users need? Usability and utility are
equally important: It matters little that something is easy if it's not what you want. It's
also no good if the system can hypothetically do what you want, but you can't make
it happen because the user interface is too difficult. To study a design's utility, you
can use the same user research methods that improve usability.
Usability Parameters                              Unify CRM   Sales Force   Sugar CRM   Enterprise CRM
Training time                                         80           70           60            65
Number of commands/actions per task                   10           55           49            70
Number of commands/features that are never used       75           62           53            10
Number of times "help" is accessed                    89           64           47            20
Time to complete a task                               10           59           50            64
Error rate                                            90            5           20            56
Number of tasks completed within a given time         95           70           50            62
Error recovery time                                   10           67           52            20
Table – 11 (Usability Metrics)
[Figure: Usability metrics on a 0-100% scale for Unify CRM, Sales Force,
Sugar CRM and Enterprise CRM, from training time through time to complete
a task]
Hypothesis I
From the table below, we can see that the Sig. (significance) value for every
performance metric is less than the 0.05 level of significance. This leads us
to reject the null hypothesis H0 Perf and accept the alternate hypothesis
H1 Perf. Thus, we can infer that the performance metrics in the framework have
a significant impact on the quality of a web application.
ANOVA - TEST

SimultaneousBrowserConnections
  Between Groups: SS = 1767.294, df = 2, MS = 883.647, F = .047, Sig. = .024
  Within Groups:  SS = 8872895.206, df = 471, MS = 18838.419
  Total:          SS = 8874662.500, df = 473
WarmUpTime
  Between Groups: SS = 6471302215714930000.000, df = 2, MS = 3235651107857466000.000, F = 25.886, Sig. = .000
  Within Groups:  SS = 58748111000319700000.000, df = 470, MS = 124995980851744200.000
  Total:          SS = 65219413216034700000.000, df = 472
TotalRequests
  Between Groups: SS = 1622.989, df = 2, MS = 811.495, F = 165.212, Sig. = .000
  Within Groups:  SS = 2313.477, df = 471, MS = 4.912
  Total:          SS = 3936.466, df = 473
TotalConnections
  Between Groups: SS = 89438483925.943, df = 2, MS = 44719241962.971, F = 434.481, Sig. = .02
  Within Groups:  SS = 48478011510.397, df = 471, MS = 102925714.459
  Total:          SS = 137916495436.340, df = 473
AvgRequestsPerSec
  Between Groups: SS = 17925544532.297, df = 2, MS = 8962772266.149, F = 371.106, Sig. = .005
  Within Groups:  SS = 11375360433.156, df = 471, MS = 24151508.351
  Total:          SS = 29300904965.454, df = 473
AvgTimeFirstByte
  Between Groups: SS = 6.227, df = 2, MS = 3.114, F = .031, Sig. = .014
  Within Groups:  SS = 47872.068, df = 471, MS = 101.639
  Total:          SS = 47878.295, df = 473
AvgTimeLastByte
  Between Groups: SS = 1174906.874, df = 2, MS = 587453.437, F = 69.192, Sig. = .007
  Within Groups:  SS = 3998899.936, df = 471, MS = 8490.233
  Total:          SS = 5173806.810, df = 473
NumberOfBytesSent
  Between Groups: SS = 4.482, df = 2, MS = 2.241, F = 13.763, Sig. = .003
  Within Groups:  SS = 76.699, df = 471, MS = .163
  Total:          SS = 81.181, df = 473
HttpErrors
  Between Groups: SS = 4.482, df = 2, MS = 5.114, F = 13.763, Sig. = .011
  Within Groups:  SS = 76.699, df = 471, MS = 125.163
  Total:          SS = 81.181, df = 473
Table – 12 First ANOVA Test
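The decision rule applied here (reject H0 when Sig. < 0.05) rests on the one-way ANOVA F statistic, the ratio of between-group to within-group mean squares. The sketch below recomputes F from first principles on made-up response times for three hypothetical CRM groups; it is illustrative only and uses none of the study's data.

```python
def one_way_anova_f(groups):
    """Return (ss_between, ss_within, df_between, df_within, F)."""
    all_vals = [v for g in groups for v in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: group means vs. the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: observations vs. their own group mean.
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups for v in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, df_between, df_within, f_stat

# Made-up response times (ms) for three CRM applications:
groups = [[120, 130, 125], [150, 155, 160], [110, 115, 112]]
ssb, ssw, dfb, dfw, f = one_way_anova_f(groups)
print(dfb, dfw, round(f, 2))
```

SPSS then converts F, together with its two degrees of freedom, into the Sig. (p) value; a large F yields a small Sig.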
Hypothesis II
We can clearly see below that the Sig. (significance) value for every security
metric is less than the 0.05 level of significance, which means we can reject
the null hypothesis H0 Sec and accept the alternate hypothesis H1 Sec. Thus, it
is inferred that the security metrics in the framework have a significant impact
on the quality of a web application.
ANOVA - TEST

SQLinjection
  Between Groups: SS = .604, df = 2, MS = .302, F = .377, Sig. = .032
  Within Groups:  SS = 15.214, df = 19, MS = .801
  Total:          SS = 15.818, df = 21
CrossSiteScripting
  Between Groups: SS = .286, df = 2, MS = .143, F = .173, Sig. = .004
  Within Groups:  SS = 15.714, df = 19, MS = .827
  Total:          SS = 16.000, df = 21
SessionHijacking
  Between Groups: SS = .073, df = 2, MS = .037, F = .138, Sig. = .002
  Within Groups:  SS = 5.018, df = 19, MS = .264
  Total:          SS = 5.091, df = 21
DenialOfService
  Between Groups: SS = .006, df = 2, MS = .003, F = .014, Sig. = .021
  Within Groups:  SS = 4.357, df = 19, MS = .229
  Total:          SS = 4.364, df = 21
BufferOverflows
  Between Groups: SS = 63.955, df = 2, MS = 31.977, F = .752, Sig. = .014
  Within Groups:  SS = 807.500, df = 19, MS = 42.500
  Total:          SS = 871.455, df = 21
Table – 13 Second ANOVA Test
Hypothesis III
From the table below, it is clear that the Sig. (significance) value for the
usability metrics is less than the 0.05 level of significance (DecisionTime
and ActionsPerTask being the exceptions), which means we can reject the null
hypothesis H0 Usab and accept the alternate hypothesis H1 Usab. Thus, we can
infer that the usability metrics in the framework have a significant impact on
the quality of a web application.
ANOVA - TEST

TrainingTime
  Between Groups: SS = .286, df = 2, MS = .143, F = .173, Sig. = .031
  Within Groups:  SS = 15.714, df = 19, MS = .827
  Total:          SS = 16.000, df = 21
ProficiencyTime
  Between Groups: SS = 63.955, df = 2, MS = 31.977, F = .752, Sig. = .009
  Within Groups:  SS = 807.500, df = 19, MS = 42.500
  Total:          SS = 871.455, df = 21
ErrorRate
  Between Groups: SS = 9.657, df = 2, MS = 4.829, F = .684, Sig. = .006
  Within Groups:  SS = 134.161, df = 19, MS = 7.061
  Total:          SS = 143.818, df = 21
ErrorRecoveryTime
  Between Groups: SS = 2.255, df = 2, MS = 1.127, F = .931, Sig. = .002
  Within Groups:  SS = 23.018, df = 19, MS = 1.211
  Total:          SS = 25.273, df = 21
DecisionTime
  Between Groups: SS = 16.578, df = 2, MS = 8.289, F = 1.578, Sig. = .232
  Within Groups:  SS = 99.786, df = 19, MS = 5.252
  Total:          SS = 116.364, df = 21
PositiveStatements
  Between Groups: SS = 3.448, df = 2, MS = 1.724, F = .367, Sig. = .000
  Within Groups:  SS = 89.143, df = 19, MS = 4.692
  Total:          SS = 92.591, df = 21
NegativeStatements
  Between Groups: SS = 4.294, df = 2, MS = 2.147, F = .479, Sig. = .004
  Within Groups:  SS = 85.161, df = 19, MS = 4.482
  Total:          SS = 89.455, df = 21
ActionsPerTask
  Between Groups: SS = .604, df = 2, MS = .302, F = .377, Sig. = .090
  Within Groups:  SS = 15.214, df = 19, MS = .801
  Total:          SS = 15.818, df = 21
FeaturesNeverUsed
  Between Groups: SS = .073, df = 2, MS = .037, F = .138, Sig. = .000
  Within Groups:  SS = 5.018, df = 19, MS = .264
  Total:          SS = 5.091, df = 21
TaskCompletionTime
  Between Groups: SS = .006, df = 2, MS = .003, F = .014, Sig. = .002
  Within Groups:  SS = 4.357, df = 19, MS = .229
  Total:          SS = 4.364, df = 21
Table – 14 Third ANOVA Test
Conclusion
Based on the study conducted for web applications in general, and CRM
applications in particular, driven from the Web Metrics Framework, we have
drawn the following conclusions about the improvement of the CRM application
for the organization under study as compared to open source CRM applications.
1. From our studies we conclude that Salesforce CRM is the most effective in
   terms of performance, security and usability. Second is Sugar CRM, which
   also shows good results when these measures are tested. Enterprise CRM is
   third in terms of overall quality, and fourth is Unify CRM from a local
   software house.
2. The Quality Manager of the software house has been provided with a set of
   metrics to improve overall productivity at the individual level as well as
   at the team performance level.
3. The metrics framework is not specific to CRM applications; it can also be
   applied to other web-based applications to improve their quality.
Based on the statistical analysis of data collected from the organization, we
have concluded that:
1. It is now clear why their first three versions of Unify CRM failed: proper
   measures were absent.
2. The post-release issues have no relationship with the software defects
   found after the release of the software.
Future Work
In this study, we presented three hypotheses related to the Web metrics
framework model. The next step is to apply the framework to the application
and, finally, to study the effect of the proposed Web metrics framework on
overall software quality. This is a limitation of the present study; in our
future studies, we will provide the quality framework analysis for other
applications. More work can be done on other metrics related to the quality of
web applications. We intend to improve upon this framework and test it on
other web applications as well.
References (Revised)
[1] Athula Ginige and San Murugesan, "Web Engineering: An Introduction," IEEE
Multimedia, Vol. 8, No. 1, January 2001, pp. 14-18.
[2] Jukka Heinonen and Marcus Hägert, "Web Metrics," research paper, 2004.
http://www.abo.fi/~kaisa/
[9] http://www.clickz.com/showPage.html?page=992351
1. Website Performance
   http://www.hurolinan.com/resources/resource.asp?LocatorCode=416
   http://searchsoftwarequality.techtarget.com/originalContent/0,289142,sid92_gci1260130,00.html
2. Web Security
   http://www.webopedia.com/TERM/S/security.html
   http://www.websense.com/global/en/ResourceCenter/Glossary/web-security.php
   http://www.arctecgroup.net/pdf/0703-OWASPMetrics.pdf
   http://www.securitymetrics.org/content/attach/Metricon2.0/Grossman_Metricon_2.pdf
3. Usability
   http://www.useit.com/alertbox/20030825.html
   www.ccs.neu.edu/home/tarase/vita.htm
   http://sigchi.org/chi97/proceedings/sig/jms.htm