
Comparative investment analysis of mutual funds

Name : R. Divya

Hall Ticket No.: 172E1E00A7

Master of Business Administration

College Name: Vaageshwari Institute of Management Sciences


Chapter- III
Theoretical Framework
Introduction
With the entry of private sector funds in 1993, a new era started in the Indian mutual fund industry, giving Indian investors a wider choice of fund families. Also, 1993 was the year in which the first Mutual Fund Regulations came into being, under which all mutual funds, except UTI, were to be registered and governed. The erstwhile Kothari Pioneer (now merged with Franklin Templeton) was the first private sector mutual fund registered in July 1993.

Fourth Phase – since February 2003

In February 2003, following the repeal of the Unit Trust of India Act 1963, UTI was bifurcated into two separate entities. One is the Specified Undertaking of the Unit Trust of India, with assets under management of Rs. 29,835 crores as at the end of January 2003, representing broadly the assets of the US 64 scheme, assured return and certain other schemes. The Specified Undertaking of Unit Trust of India, functioning under an administrator and under the rules framed by the Government of India, does not come under the purview of the Mutual Fund Regulations. The second is the UTI Mutual Fund Ltd, sponsored by SBI, PNB, BOB and LIC. It is registered with SEBI and functions under the Mutual Fund Regulations. With the bifurcation of the erstwhile UTI, which had in March 2000 more than Rs. 76,000 crores of assets under management, and with the setting up of a UTI Mutual Fund conforming to the SEBI Mutual Fund Regulations, and with recent mergers taking place among different private sector funds, the mutual fund industry has entered its current phase of consolidation and growth. As at the end of September 2004, there were 29 funds, which managed assets of Rs. 1,53,108 crores under 421 schemes.
While the Indian mutual fund industry has grown in size by about 320% from March 1993 (Rs. 470 billion) to December 2004 (Rs. 1,505 billion) in terms of AUM, the AUM of the sector excluding UTI has grown over 8 times, from Rs. 152 billion in March 1999 to $148 billion as at March 2008.
Many nationalized banks got into the mutual fund business in the early nineties and got off to a good start due to the stock market boom prevailing at the time. These banks did not really understand the mutual fund business and viewed it simply as another kind of banking activity. Few hired specialized staff and generally chose to transfer staff from the parent organizations. The performance of most of the schemes floated by these funds was not good. Some schemes had offered guaranteed returns and their parent organizations had to bail out these AMCs by paying large amounts of money as the difference between the guaranteed and actual returns. The service levels were also very bad. Most of these AMCs have not been able to retain staff, float new schemes, etc.
Components of the Study

Research can be classified into two categories: basic research, which is done in a lab or clinical setting, and applied research, which is done with real subjects in real-world situations. From these categories of research, we have the following general types of studies:

Animal Study: An animal or in vivo study is a study in which animals are used
as subjects. A common use of an animal study is with a clinical trial (see
below) and as a precursor to evaluating a medical intervention on humans.
However, it is critical to recognize that results from animal studies should not
be extrapolated to draw conclusions on what WILL happen in humans.

Case Study: A case study provides significant and detailed information about a
single participant or a small group of participants. “Case studies are often
referred to interchangeably with ethnography, field study, and participant
observation.” Unlike other studies which rely heavily on statistical analysis,
the case study is often undertaken to identify areas for additional research and
exploration.

Clinical Trial Study: A clinical trial study is often used in the areas of health
and medical treatments that will presumably yield a positive effect. Typically a
small group of people or animals are selected based upon the presence of a
specific medical condition. This group is used to evaluate the effectiveness of a
new medication or treatment, differing dosages, or new applications of existing
treatments. Due to the risk involved with many new medical treatments, the
initial subjects in a clinical trial may be animals rather than humans. After positive
outcomes are obtained, research can then proceed to a human study where the
new treatment is compared against results from existing treatments.

Correlational Study: Correlational studies evaluate the relationship between


variables and determine if there is a positive correlation, a negative correlation,
or no correlation. Please note, a positive correlation does not mean one thing
causes another. Correlational studies are typically used in naturalistic
observations, surveys, and with archival research.
Cross-sectional Survey: Also known as the synchronic study, a cross-sectional
survey collects data at a single point in time but the questions asked of a
participant may be about current and past experiences. They are often done to
evaluate some aspect of public health policy.

Epidemiological Study: Epidemiological studies evaluate the factors and


associations linked to diseases. Types of epidemiological studies include case
series studies, case control studies, cohort studies, longitudinal studies, and
outbreak investigations.

Epidemiological studies are often beneficial in identifying areas for a more
controlled research evaluation; however, all too often, readers of epidemiological
research miscategorize links and associations as causes. In addition, a common
problem with epidemiological studies is that they rely on memory recall which
can be quite unreliable.

Experimental Study: In an experimental study, specific treatments are applied


to a sample or group and the results are observed.

Literature Review: A literature review is an exhaustive search of all of the


relevant literature related to a specific research topic.

Longitudinal Study: A specific type of epidemiological study, the longitudinal


study follows subjects over a long period of time, asking a specific research
question with repeated samples of data gathered across the duration of the
study. These studies are often used as the basis for specific experimental
studies. For example, the Framingham Heart Study has evaluated people from
the town of Framingham, Massachusetts since 1948 looking for patterns in heart
disease.

Meta-analysis: A meta-analysis is a statistical process in which the results of


multiple studies evaluating a similar research objective are collected and pooled
together. They are often used to determine the effectiveness of healthcare
interventions and experiments.
Content

A mutual fund is a professionally managed investment fund that pools money


from many investors to purchase securities. These investors may be retail or
institutional in nature.
Mutual funds have advantages and disadvantages compared to direct investing
in individual securities. The primary advantages of mutual funds are that they
provide economies of scale, a higher level of diversification, they provide
liquidity, and they are managed by professional investors. On the negative side,
investors in a mutual fund must pay various fees and expenses.
Primary structures of mutual funds include open-end funds, unit investment
trusts, and closed-end funds. Exchange-traded funds (ETFs) are open-end funds
or unit investment trusts that trade on an exchange. Mutual funds are also
classified by their principal investments as money market funds, bond or fixed
income funds, stock or equity funds, hybrid funds or other. Funds may also be
categorized as index funds, which are passively managed funds that match the
performance of an index, or actively managed funds. Hedge funds are not
mutual funds; hedge funds cannot be sold to the general public and are subject
to different government regulations.
Mutual funds are normally classified by their principal investments, as
described in the prospectus and investment objective. The four main categories
of funds are money market funds, bond or fixed income funds, stock or equity
funds, and hybrid funds. Within these categories, funds may be sub-classified
by investment objective, investment approach or specific focus.
The types of securities that a particular fund may invest in are set forth in the
fund's prospectus, a legal document which describes the fund's investment
objective, investment approach and permitted investments. The investment
objective describes the type of income that the fund seeks. For example, a
capital appreciation fund generally looks to earn most of its returns from
increases in the prices of the securities it holds, rather than from dividend or
interest income. The investment approach describes the criteria that the fund
manager uses to select investments for the fund.
Bond, stock, and hybrid funds may be classified as either index (or passively-
managed) funds or actively managed funds.
Inflation adjustment is a very important point while comparing mutual funds and fixed deposits. FDs don't come with inflation adjustment guarantees, and if the interest rate is lower than the inflation rate, you actually end up losing the value of your money. In FY 2011-12, the inflation rate in India was about 7%, while the interest rate for a tenure of around 1 year was also around 7% (6.5% for ICICI and HDFC banks, 6.75% for Citibank and HSBC, 7.10% for Axis and Yes Bank, and so on; higher rates were available, but only for large lump-sum investments of the order of Rs. 1 crore). Thus, if you had invested in bank FDs for that financial year, you either failed to beat inflation or ended up with minimal inflation-adjusted positive returns. On the other hand, at least half a dozen mutual funds yielded returns greater than 8% (some as high as 12-14%), thereby giving handsome inflation-adjusted returns. Mutual funds usually outrun inflation and deliver positive real returns.
Mutual funds and fixed deposits - capital appreciation: when it comes to capital appreciation, mutual funds are better than fixed deposits because of the equity investment. Over longer time periods, market movements tend to translate into rising returns, and your mutual fund manager brings the expertise and professionalism needed to ensure better capital appreciation.
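
To make the inflation-adjustment point concrete, here is a minimal Python sketch with illustrative figures only (a 7% inflation rate and nominal rates in line with those quoted above); it computes the real, inflation-adjusted return of a fixed deposit and of a mutual fund using the standard Fisher relation.

def real_return(nominal_rate, inflation_rate):
    """Inflation-adjusted (real) return via the Fisher relation."""
    return (1 + nominal_rate) / (1 + inflation_rate) - 1

inflation = 0.07   # inflation rate cited above for FY 2011-12
fd_rate = 0.07     # typical 1-year FD rate in the same period
fund_rate = 0.12   # a mutual fund yielding 12% (illustrative)

print(f"FD real return:   {real_return(fd_rate, inflation):.2%}")    # 0.00%
print(f"Fund real return: {real_return(fund_rate, inflation):.2%}")  # about 4.67%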
Developing countries like India face the enormous task of finding sufficient capital for their development efforts. Most of these countries find it difficult to get out of the vicious circle of poverty: low income, low saving, low investment, low employment and so on. With a high capital-output ratio, India needs very high rates of investment to make a leap forward in its efforts to attain high levels of growth. Since the beginning of planning, the emphasis has been on investment as the primary instrument of economic growth and increase in national income. In order to achieve production as per target, investment was considered the crucial determinant, and capital formation had to be supported by an appropriate volume of saving. Investment is the sacrifice of a certain present value for an uncertain future reward. Investments are always interesting, challenging and rewarding. Generally, where there is higher risk, a higher rate of return is expected; risk and reward go together. The major features of an investment are safety of the principal amount, liquidity, income stability, appreciation and easy transferability. A variety of investment avenues are available, such as shares, bank deposits, company deposits, gold and silver, real estate, life insurance, postal savings and so on. Investors place their surplus money in the above-mentioned avenues based on their risk-taking attitude.
Factors
You might have come across a plethora of mutual funds offering various benefits at a nominal investment. However, at first glance, all the funds in a particular category look similar, making it a challenge to take a wise decision. After all, investing is a long-term venture and you need to be aware of what you are getting into. Most investors use fund returns as the only criterion to measure and compare performance across funds.
Basically, fund return is the difference between the Net Asset Value at the beginning and the Net Asset Value at the end of a given period. Annualised returns only tell you about the value addition or reduction during a given period. But what about consistency of returns, quality of the fund house and risk-adjusted returns? For choosing the right fund, you need to compare mutual funds on these grounds as well. By using some of the financial ratios, you will be able to make the right decision.
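
As a minimal sketch of the return calculation just described (the NAV figures and holding period below are hypothetical, not drawn from any scheme), the point-to-point return is the percentage change in NAV over the period, and the annualised return compounds it to a yearly rate:

def absolute_return(nav_start, nav_end):
    """Point-to-point return between two NAV observations."""
    return nav_end / nav_start - 1

def annualised_return(nav_start, nav_end, years):
    """Compound annual growth rate (CAGR) implied by the two NAVs."""
    return (nav_end / nav_start) ** (1 / years) - 1

nav_begin, nav_close, holding_years = 25.0, 34.0, 3.0   # hypothetical values
print(f"Absolute return:   {absolute_return(nav_begin, nav_close):.2%}")                    # 36.00%
print(f"Annualised return: {annualised_return(nav_begin, nav_close, holding_years):.2%}")   # about 10.8% p.a.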
Selecting the appropriate mutual funds is the first step towards earning returns
on the expected lines. In order to arrive at the best mutual funds, you need to
possess the art of comparing the funds. All you need is to start with your
investment objective. When you know your goals, you can easily decide what to
look for in a fund. You may use the following parameters to compare a set of
mutual funds:

 Benchmark
It provides a yardstick against which you can measure fund performance.
It indicates how much returns the fund has generated as against how
much it should have delivered. As per SEBI's mandate, each fund
declares its benchmark and considers it as a target to analyse
performance. If the index rises by 12% but the NAV of the fund rises by 14%,
then the fund is said to have outperformed the index. Conversely, if the
index falls by 10% but the fund loses 12%, then the fund is said to
have under-performed the index. Basically, the comparison should be made
to look for a fund which gains more in a market rally and loses less in a
slump.

 Investment Horizon
Your investment horizon becomes a driving factor for fund selection and
comparison. Investment horizon relates to the time period for which you
stay invested in the given fund. The type of fund chosen for comparison
should be according to the investment horizon. Equity funds are suitable
for a long term horizon of at least 7 years or more. The fund objective in
this time period is wealth accumulation at a relatively high risk.

From this context, the fund returns selected for comparison should match
the investment horizon. It means that while comparing two equity funds,
you may consider fund returns of past 5 to 10 years. In the same way, for
comparing two liquid funds, consider fund returns of past 6 months to 1
year. Select the fund which has given superior performance across
different time intervals.

 Riskiness
Whenever you invest in any mutual fund, you undertake some kind of
risk. This risk relates to variability of fund NAV as per the overall market
movements. According to the investment thumb rule, higher risk needs to be
rewarded with higher returns. But plain vanilla returns do not reflect this
aspect of a mutual fund. Thus, you need to use a better measure for
comparing two funds on the grounds of risk-adjusted returns.

You may use alpha and beta for this. These are financial ratios which tell
you about the rewarding potential of a mutual fund. Beta tells you how much
risk is involved in investing in a fund. Alpha tells you how much extra
return the fund will generate over and above the underlying benchmark.
Remember, your target is to beat the benchmark, not to imitate it. Suppose
two funds have the same beta of 1.5, and Fund A and Fund B have
alphas of 2 and 2.5 respectively. Fund B is better because it gives higher
risk-adjusted returns.
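
One common way to estimate the beta and alpha discussed above is to regress the fund's periodic returns on the benchmark's returns. The sketch below is only illustrative: the monthly return series are made up, and the least-squares fit uses numpy rather than any fund-analytics library.

import numpy as np

# Hypothetical monthly returns (as decimals) for a benchmark and a fund.
benchmark = np.array([0.02, -0.01, 0.03, 0.015, -0.02, 0.025])
fund      = np.array([0.03, -0.02, 0.045, 0.02, -0.035, 0.04])

# The slope of the regression line is beta; the intercept is the (periodic) alpha.
beta, alpha = np.polyfit(benchmark, fund, deg=1)

print(f"beta  = {beta:.2f}")    # sensitivity of the fund to benchmark movements
print(f"alpha = {alpha:.4f}")   # extra return not explained by the benchmark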
 Expense Ratio
Your investment in mutual funds comes at a cost called the expense ratio.
It is an annual fee which the fund house charges the unitholders to
manage the portfolio on their behalf. The level of the expense ratio has a direct
impact on the level of fund returns earned by investors, because the
expense ratio is charged as a percentage of the fund's assets under
management. A higher expense ratio ultimately dents the profits earned
by the investors. Look for a fund which has the lowest expense ratio in
the given category.
While using the expense ratio, you need to keep a few things in mind.
The expense ratio of direct plans is lower than that of regular plans due to
the absence of distributor commission. Compare one regular plan with another
regular plan, and one direct plan with another direct plan. Do not compare an index fund with an
actively managed fund; the expense ratio of an index fund is lower due to
its low fund management fee. Compare active funds with active funds. Do
not compare an equity fund with a debt fund; owing to higher transaction costs
and brokerage, equity funds have higher expense ratios than debt funds.
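
To illustrate how the expense ratio dents long-term returns, the following sketch compares the growth of the same investment under a low-cost and a high-cost plan; the gross return, horizon and expense figures are assumptions chosen purely for illustration.

def future_value(principal, gross_return, expense_ratio, years):
    """Grow an investment, assuming expenses reduce the gross return every year."""
    net_return = gross_return - expense_ratio
    return principal * (1 + net_return) ** years

invested, gross, horizon = 100_000, 0.12, 20          # illustrative assumptions
for ratio in (0.005, 0.020):                          # e.g. a direct plan vs a regular plan
    value = future_value(invested, gross, ratio, horizon)
    print(f"Expense ratio {ratio:.1%}: final value = {value:,.0f}")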

Advantages

 Increased diversification: A fund diversifies by holding many securities. This diversification decreases risk.
 Daily liquidity: Shareholders of open-end funds and unit investment trusts
may sell their holdings back to the fund at regular intervals at a price equal
to the net asset value of the fund's holdings. Most funds allow investors to
redeem in this way at the close of every trading day.
 Professional investment management: Open- and closed-end funds hire
portfolio managers to supervise the fund's investments.
 Ability to participate in investments that may be available only to larger
investors. For example, individual investors often find it difficult to invest
directly in foreign markets.
 Service and convenience: Funds often provide services such as check
writing.
 Government oversight: Mutual funds are regulated by a governmental body.
 Transparency and ease of comparison: All mutual funds are required to
report the same information to investors, which makes them easier to
compare to each other.
The first modern investment funds (the precursor of today's mutual funds) were
established in the Dutch Republic. In response to the financial crisis of 1772–
1773, Amsterdam-based businessman Abraham (or Adriaan) van Ketwich
formed a trust named Eendragt Maakt Magt ("unity creates strength"). His aim
was to provide small investors with an opportunity to diversify.[1][2]
Mutual funds were introduced to the United States in the 1890s. Early U.S.
funds were generally closed-end funds with a fixed number of shares that often
traded at prices above the portfolio net asset value. The first open-end mutual
fund with redeemable shares was established on March 21, 1924 as the
Massachusetts Investors Trust (it is still in existence today and is now managed
by MFS Investment Management).
In the United States, closed-end funds remained more popular than open-end
funds throughout the 1920s. In 1929, open-end funds accounted for only 5% of
the industry's $27 billion in total assets.
After the Wall Street Crash of 1929, the United States Congress passed a series
of acts regulating the securities markets in general and mutual funds in
particular.

 The Securities Act of 1933 requires that all investments sold to the public,
including mutual funds, be registered with the SEC and that they provide
prospective investors with a prospectus that discloses essential facts about
the investment.
 The Securities and Exchange Act of 1934 requires that issuers of securities,
including mutual funds, report regularly to their investors. This act also
created the Securities and Exchange Commission, which is the principal
regulator of mutual funds.
 The Revenue Act of 1936 established guidelines for the taxation of mutual
funds.
 The Investment Company Act of 1940 established rules specifically
governing mutual funds.
These new regulations encouraged the development of open-end mutual funds
(as opposed to closed-end funds).
Growth in the U.S. mutual fund industry remained limited until the 1950s, when
confidence in the stock market returned. By 1970, there were approximately
360 funds with $48 billion in assets.[3]
The introduction of money market funds in the high interest rate environment of
the late 1970s boosted industry growth dramatically. The first retail index fund,
First Index Investment Trust, was formed in 1976 by The Vanguard Group,
headed by John Bogle; it is now called the "Vanguard 500 Index Fund" and is
one of the world's largest mutual funds. Fund industry growth continued into the
1980s and 1990s.
According to Pozen and Hamacher, growth was the result of three factors:

1. A bull market for both stocks and bonds,


2. New product introductions (including funds based on municipal bonds,
various industry sectors, international funds, and target date funds) and
3. Wider distribution of fund shares, including through employee-directed
retirement accounts such as 401(k), other defined contribution
plans and individual retirement accounts (IRAs). Among the new
distribution channels were retirement plans. Mutual funds are now the
preferred investment option in certain types of fast-growing retirement
plans, specifically in 401(k), other defined contribution plans and
in individual retirement accounts (IRAs), all of which surged in
popularity in the 1980s.[4]
In 2003, the mutual fund industry was involved in a scandal involving unequal
treatment of fund shareholders. Some fund management companies allowed
favored investors to engage in late trading, which is illegal, or market timing,
which is a practice prohibited by fund policy. The scandal was initially
discovered by former New York Attorney General Eliot Spitzer and led to a
significant increase in regulation. In a study of German mutual
funds, Gomolka (2007) found statistical evidence of illegal time zone arbitrage
in the trading of German mutual funds [5]. Though reported to
regulators, BaFin never commented on these results.
Total mutual fund assets fell in 2008 as a result of the financial crisis of 2007–
2008.
Mutual funds today
At the end of 2016, mutual fund assets worldwide were $40.4 trillion, according
to the Investment Company Institute.[6] The countries with the largest mutual
fund industries are:

1. United States: $18.9 trillion
2. Luxembourg: $3.9 trillion
3. Ireland: $2.2 trillion
4. Germany: $1.9 trillion
5. France: $1.9 trillion
6. Australia: $1.6 trillion
7. United Kingdom: $1.5 trillion
8. Japan: $1.5 trillion
9. China: $1.3 trillion
10. Brazil: $1.1 trillion
In the United States, mutual funds play an important role in U.S. household
finances. At the end of 2016, 22% of household financial assets were held in
mutual funds. Their role in retirement savings was even more significant, since
mutual funds accounted for roughly half of the assets in individual retirement
accounts, 401(k)s and other similar retirement plans.[7] In total, mutual funds are
large investors in stocks and bonds.
Luxembourg and Ireland are the primary jurisdictions for the registration
of UCITS funds. These funds may be sold throughout the European Union and
in other countries that have adopted mutual recognition regimes.
Chapter- IV
Data Analysis & Interpretation

The purpose of analysing data is to obtain usable and useful information. The
analysis, irrespective of whether the data is qualitative or quantitative, may:
• describe and summarise the data
• identify relationships between variables
• compare variables
• identify the difference between variables
• forecast outcomes
The assumption of the qualitative researcher is that the human instrument is capable
of ongoing fine-tuning in order to generate the most fertile array of data. Morgan
and Krueger (1998:Vol. 6:3-17) on the other hand, provide important views
when they reiterate that the analysis of qualitative methods must be systematic,
sequential, verifiable and continuous. It requires time, is jeopardised by delay, is
a process of comparison, is improved by feedback, seeks to enlighten and
should entertain alternative explanations. As with qualitative methods for data
analysis, the purpose of conducting a quantitative study, is to produce findings,
but whereas qualitative methods use words (concepts, terms, symbols, etc.) to
construct a framework for communicating the essence of what the data reveal,
procedures and techniques are used to analyse data numerically, called
quantitative methods (Sesay, 2011:74). On the whole, regardless of the method
(qualitative or quantitative), cf. par. 1.4.2, p. 13; 1.4.5, p. 16; 1.4.6, p. 17; 5.4.2,
p. 318), the purpose of conducting a study, is to produce findings, and in order
to do so, data should be analysed to transform data into findings. In this study,
data will be analysed using both the qualitative and quantitative method. At this
point in time, one has to take a closer look at both methods of analysis.
Regarding qualitative and quantitative analysis of data, Kreuger and Neuman
(2006:434) offer a useful outline of the differences and similarities between
qualitative (cf. par. 6.2.1, p. 358) and quantitative methods (cf. par. 6.2.2, p.
367) of data analysis. According to these authors, qualitative and quantitative
analyses are similar in four ways. Both forms of data analysis involve:
 Inference - the use of reasoning to reach a conclusion based on evidence;
 A public method or process - revealing their study design in some way;
 Comparison as a central process - identification of patterns or aspects that are similar;
 Striving to avoid errors, false conclusions and misleading inferences.
Table Title: Balance in Personal & Professional Life

Sr. No   Attributes        No. of Respondents   Percentage
1        Yes               09                   45%
2        No                03                   15%
3        To some extent    08                   40%
TOTAL                      20                   100%

[Pie chart - No. of Respondents: Yes 45%, No 15%, To some extent 40%]

Interpretation:
Here, 40% of the employees say 'to some extent', 45% say 'yes' and 15% say 'no'.
We can infer that some of the employees are not able to balance their personal and professional life.
13.The people here are pleasant and co-operative to work with.
a. Yes b. No c. To some extent.

Table Title: Nature of Colleagues

Sr. No   Attributes        No. of Respondents   Percentage
1        Yes               16                   80%
2        No                01                   5%
3        To some extent    03                   15%
TOTAL                      20                   100%

[Pie chart - No. of Respondents: Yes 80%, No 5%, To some extent 15%]

Interpretation:
Here, 80% of the employees say 'yes', 15% say 'to some extent' and 5% say 'no'.
14.There is someone at work who encourages my development.
a. Yes b. No c. To some extent.

Table Title: Encouragement by Colleagues

Sr. No   Attributes        No. of Respondents   Percentage
1        Yes               13                   65%
2        No                03                   15%
3        To some extent    04                   20%
TOTAL                      20                   100%

[Pie chart - No. of Respondents: Yes 65%, No 15%, To some extent 20%]

Interpretation:
Here, 65% of the employees say 'yes', 20% say 'to some extent' and 15% say 'no'.
15.Even if I had the opportunity to get a similar job with
another organization, I would stay with my present company.
a. Yes b. No c. To some extent.

Table Title: Job Switch Opportunity

Sr. No   Attributes        No. of Respondents   Percentage
1        Yes               07                   35%
2        No                01                   5%
3        To some extent    12                   60%
TOTAL                      20                   100%

[Pie chart - No. of Respondents: Yes 35%, No 5%, To some extent 60%]

Interpretation:
Here, 60% of the employees say 'to some extent', 35% say 'yes' and 5% say 'no'.

The core differences between qualitative (cf. par. 6.2.1, p. 358) and quantitative data (cf. par. 6.2.2, p. 367) analysis are as follows (Kreuger & Neuman, 2006:434-435): qualitative data analysis is less standardised, with the wide variety in approaches to qualitative research matched by the many approaches to data analysis, while quantitative researchers choose from a specialised, standard set of data analysis techniques. Qualitative analysis also involves elaborating a set of generalisations which suggest that certain relationships hold firm in the setting being examined, affirming that these cover all the known eventualities in the data set, and formalizing these theoretical constructs and making inferences from them to other cases in place and time.
As we have seen so far from our discussion of qualitative data analysis, there are always variations in the number and description of steps for the same process by different authors. To the preceding body of knowledge, outlined by different authors, one can add the views of Watling and James (2012:385-395). According to these authors, the process of qualitative data analysis consists of six stages (steps), namely:
Defining and identifying data. From the outset, it is crucial to obtain a clear understanding of the meaning of data and, even more importantly, of the data required in accordance with the research question and aims.
Collecting and storing data. When collecting data, most researchers start to form opinions and judgements, which result in theories being developed in the mind of the researcher; as such, one has to consider not only ways to collect data, but also ways to store data to make them accessible for analysis. Interviews, for instance, can be recorded by means of a digital recorder, transcribed and stored (loaded) on a computer programme such as Atlas.ti Version 6.
Data reduction and sampling. During the data collection process (cf. par. 5.8.4, p. 330),
reaching a point of saturation implies that all data were reduced, filtered and sampled through the process of analysis. It is therefore critical for the researcher, when analysing data, to determine what one already knows to be important or relevant, in accordance with the intended purpose of the investigation. Stated differently, the researcher needs to establish, on the one hand, which data are not relevant, and on the other hand, which data encapsulate the essence and evidence one wishes to focus on for a more detailed analysis. Hence, from the preceding it can be inferred that it is important to establish incidences and similarities in the respective interviews. In addition, one should establish whether the expected reactions (responses) were obtained and whether there are still deficiencies regarding certain questions.
Structuring and coding data. Structuring and coding of data underpin the key research outcomes and can be used to shape the data to test, refine or confirm established theory, apply theory to new circumstances or use it to generate a new theory or model, or even, as in the case of this study, to develop a new measurement instrument, such as a questionnaire (cf. par. 5.9.3, p. 339). During coding, the corpus of data has to be divided into segments and these segments are assigned codes which relate to analytic themes being developed (Fielding, 2002:163) and applied consistently over the period of analysis and over a range of data. Basic coding, carried out as a first step in the analysis of data, is both useful in itself and acts as a preparation of the data for more advanced analysis at higher levels of abstraction (Punch, 2011:175). It can therefore be deduced that structuring and coding signify an analytical process of elaborating data, as for instance obtained from semi-structured interviews, into related themes by means of codes and structures, to form an understandable framework and associations derived from the language of participants. The process of coding for this study will be considered in a later paragraph (cf. par. 6.2.2.2, p. 370).
Theory building and testing. An important purpose of research is to generate new knowledge (Watling & James, 2012:392). To this end, it might be helpful to take into consideration the set of tactics for generating meaning from qualitative data described by Miles and Huberman (1994:245-246), commented on in an ensuing paragraph. More specifically, in relation to theory building and testing as part of the process of data analysis, it can be said that, based upon the created framework, relevant deductions can be made and insight into the research question under investigation can be obtained. In building and testing theory, it is important to view the reactions of respondents, whether they correspond or not, and also to ensure that a point of saturation of data is reached.
Reporting and writing up research. In brief, the reporting and writing up of research entails putting words on paper, in the form of a report, constructing an argument based on the findings of what you have done, what you have seen and heard, the participants you interviewed and the information that comes forth from the process of data analysis. Ultimately, the conclusions drawn from the information should contribute to the body of knowledge and represent new meaning and insight into the research question.
Data interpretation refers to the implementation of processes through which data is
reviewed for the purpose of arriving at an informed conclusion. The interpretation of
data assigns a meaning to the information analysed and determines its significance
and implications.

The importance of data interpretation is evident and this is why it needs to be


done properly. Data is very likely to arrive from multiple sources and has a
tendency to enter the analysis process with haphazard ordering. Data analysis
tends to be extremely subjective. That is to say, the nature and goal of
interpretation will vary from business to business, likely correlating to the type
of data being analyzed. While there are several different types of processes that
are implemented based on individual data nature, the two broadest and most
common categories are “quantitative analysis” and “qualitative analysis”.

Yet, before any serious data interpretation inquiry can begin, it should be
understood that visual presentations of data findings are irrelevant unless a
sound decision is made regarding scales of measurement. Before any serious
data analysis can begin, the scale of measurement must be decided for the data
as this will have a long-term impact on data interpretation ROI. The varying
scales include:

 Nominal Scale: non-numeric categories that cannot be ranked or compared


quantitatively. Variables are exclusive and exhaustive.
 Ordinal Scale: categories that are exclusive and exhaustive, but with a
logical order. Quality ratings and agreement ratings are examples of ordinal
scales (i.e., good, very good, fair, etc., OR agree, strongly agree, disagree, etc.).
 Interval: a measurement scale where data is grouped into categories with
orderly and equal distances between the categories. There is always an arbitrary
zero point.
 Ratio: contains features of all three scales and, in addition, has a true (non-arbitrary) zero point.

For a more in-depth review of scales of measurement, read our article on data
analysis questions. Once scales of measurement have been selected, it is time to
select which of the two broad interpretation processes will best suit your data
needs. Let’s take a closer look at those specific data interpretation methods and
possible data interpretation problems.

How To Interpret Data?


When interpreting data, an analyst must try to discern the differences between
correlation, causation and coincidence, as well as many other biases, but must also
consider all the factors involved that may have led to a result. There are
various data interpretation methods one can use.

The interpretation of data is designed to help people make sense of numerical


data that has been collected, analyzed and presented. Having a baseline method
(or methods) for interpreting data will provide your analyst teams a structure
and consistent foundation. Indeed, if several departments have different
approaches to interpret the same data, while sharing the same goals, some
mismatched objectives can result. Disparate methods will lead to duplicated
efforts, inconsistent solutions, wasted energy and, inevitably, wasted time and money.
In this part, we will look at the two main methods of interpretation of data: with
a qualitative and a quantitative analysis.

Qualitative Data Interpretation

Qualitative data analysis can be summed up in one word – categorical. With


qualitative analysis, data is not described through numerical values or patterns,
but through the use of descriptive context (i.e., text). Typically, narrative data is
gathered by employing a wide variety of person-to-person techniques. These
techniques include:

Observations: detailing behavioral patterns that occur within an observation


group. These patterns could be the amount of time spent in an activity, the type
of activity and the method of communication employed.

Documents: much like how patterns of behavior can be observed, different


types of documentation resources can be coded and divided based on the type of
material they contain.

Interviews: one of the best collection methods for narrative data. Enquiry
responses can be grouped by theme, topic or category. The interview approach
allows for highly-focused data segmentation.

A key difference between qualitative and quantitative analysis is clearly


noticeable in the interpretation stage. Qualitative data, as it is widely open to
interpretation, must be “coded” so as to facilitate the grouping and labeling of
data into identifiable themes. As person-to-person data collection techniques
can often result in disputes pertaining to proper analysis, qualitative data
analysis is often summarized through three basic principles: notice things,
collect things, think about things.


Quantitative Data Interpretation

If quantitative data interpretation could be summed up in one word (and it really


can’t) that word would be “numerical.” There are few certainties when it comes
to data analysis, but you can be sure that if the research you are engaging in has
no numbers involved, it is not quantitative research. Quantitative analysis refers
to a set of processes by which numerical data is analyzed. More often than not,
it involves the use of statistical modeling such as standard deviation, mean and
median. Let’s quickly review the most common statistical terms:

 Mean: a mean represents a numerical average for a set of responses. When


dealing with a data set (or multiple data sets), a mean will represent a central
value of a specific set of numbers. It is the sum of the values divided by the
number of values within the data set. Other terms that can be used to describe
the concept are arithmetic mean, average and mathematical expectation.
 Standard deviation: this is another statistical term commonly appearing in
quantitative analysis. Standard deviation reveals the distribution of the
responses around the mean. It describes the degree of consistency within the
responses; together with the mean, it provides insight into data sets.
 Frequency distribution: this is a measurement gauging the rate of a response's
appearance within a data set. When using a survey, for example, frequency
distribution can determine the number of times a specific ordinal scale response
appears (i.e., agree, strongly agree, disagree, etc.). Frequency distribution is
particularly useful in determining the degree of consensus among data points,
as the short sketch after this list illustrates.
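
A brief Python illustration of these three statistics, using a hypothetical set of ordinal survey responses coded on a 1-5 agreement scale (the data are invented for the example):

from collections import Counter
from statistics import mean, stdev

# Hypothetical survey responses on a 1-5 agreement scale.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

print(f"Mean:                   {mean(responses):.2f}")
print(f"Standard deviation:     {stdev(responses):.2f}")
print(f"Frequency distribution: {dict(sorted(Counter(responses).items()))}")
# e.g. {2: 1, 3: 2, 4: 4, 5: 3} -> how often each response level appears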

Typically, quantitative data is measured by visually presenting correlation tests


between two or more variables of significance. Different processes can be used
together or separately, and comparisons can be made to ultimately arrive at a
conclusion. Other signature interpretation processes of quantitative data include:

 Regression analysis
 Cohort analysis
 Predictive and prescriptive analysis

Now that we have seen how to interpret data, let’s move on and ask ourselves
some questions: what are some data interpretation benefits? Why do all
industries engage in data research and analysis? These are basic questions, but
ones that often do not receive adequate attention.

Why Data Interpretation Is Important

The purpose of collection and interpretation is to acquire useful and usable


information and to make the most informed decisions possible. From
businesses, to newlyweds researching their first home, data collection and
interpretation provides limitless benefits for a wide range of institutions and
individuals.

Data analysis and interpretation, regardless of method and


qualitative/quantitative status, may include the following characteristics:

 Data identification and explanation


 Comparing and contrasting of data
 Identification of data outliers
 Future predictions

Data analysis and interpretation, in the end, helps improve processes and
identify problems. It is difficult to grow and make dependable improvements
without, at the very least, minimal data collection and interpretation. What is the
key word? Dependable. Vague ideas regarding performance enhancement exist
within all institutions and industries. Yet, without proper research and analysis,
an idea is likely to remain in a stagnant state forever (i.e., minimal growth).
So… what are a few of the business benefits of digital age data analysis and
interpretation? Let’s take a look!

1) Informed decision-making: A decision is only as good as the knowledge that


formed it. Informed data decision making has the potential to set industry
leaders apart from the rest of the market pack. Studies have shown that
companies in the top third of their industries are, on average, 5% more
productive and 6% more profitable when implementing informed data decision-
making processes. Most decisive actions will arise only after a problem has
been identified or a goal defined. Data analysis should include identification,
thesis development and data collection followed by data communication.

If institutions only follow that simple order, one that we should all be familiar
with from grade school science fairs, then they will be able to solve issues as
they emerge in real time. Informed decision making has a tendency to be
cyclical. This means there is really no end, and eventually, new questions and
conditions arise within the process that need to be studied further. The
monitoring of data results will inevitably return the process to the start with new
data and insights.

2) Anticipating needs with trends identification: data insights provide


knowledge, and knowledge is power. The insights obtained from market and
consumer data analyses have the ability to set trends for peers within similar
market segments. A perfect example of how data analysis can impact trend
prediction can be evidenced in the music identification application, Shazam.
The application allows users to upload an audio clip of a song they like, but
can’t seem to identify. Users make 15 million song identifications a day. With
this data, Shazam has been instrumental in predicting future popular artists.

When industry trends are identified, they can then serve a greater industry
purpose. For example, the insights from Shazam's monitoring benefit not only
Shazam in understanding how to meet consumer needs, but also grant music
executives and record label companies an insight into the pop-culture scene of
the day. Data gathering and interpretation processes can allow for industry-wide
climate prediction and result in greater revenue streams across the market. For
this reason, all institutions should follow the basic data cycle of collection,
interpretation, decision making and monitoring.
3) Cost efficiency: Proper implementation of data analysis processes can
provide businesses with profound cost advantages within their industries. A
recent data study performed by Deloitte vividly demonstrates this in finding that
data analysis ROI is driven by efficient cost reductions. Often, this benefit is
overlooked because making money is typically viewed as “sexier” than saving
money. Yet, sound data analyses have the ability to alert management to cost-
reduction opportunities without any significant exertion of effort on the part of
human capital.

A great example of the potential for cost efficiency through data analysis is
Intel. Prior to 2012, Intel would conduct over 19,000 manufacturing function
tests on their chips before they could be deemed acceptable for release. To cut
costs and reduce test time, Intel implemented predictive data analyses. By using
historic and current data, Intel now avoids testing each chip 19,000 times by
focusing on specific and individual chip tests. After its implementation in 2012,
Intel saved over $3 million in manufacturing costs. Cost reduction may not be
as “sexy” as data profit, but as Intel proves, it is a benefit of data analysis that
should not be neglected.

4) Clear foresight: companies that collect and analyze their data gain better
knowledge about themselves, their processes and performance. They can
identify performance challenges when they arise and take action to overcome
them. Data interpretation through visual representations lets them process their
findings faster and make better-informed decisions on the future of the
company.

The oft-repeated mantra of those who fear data advancements in the digital age
is “big data equals big trouble.” While that statement is not accurate, it is safe to
say that certain data interpretation problems or “pitfalls” exist and can occur
when analyzing data, especially at the speed of thought. Let’s identify three of
the most common data misinterpretation risks and shed some light on how they
can be avoided:

1) Correlation mistaken for causation: our first misinterpretation of data refers


to the tendency of data analysts to mix the cause of a phenomenon with
correlation. It is the assumption that because two actions occurred together, one
caused the other. This is not accurate as actions can occur together absent a
cause and effect relationship.
 Digital age example: assuming that increased revenue is the result of increased
social media followers... there may well be a correlation between the two,
especially with today's multi-channel purchasing experiences. But that does not
mean an increase in followers is the direct cause of increased revenue. There
could be a common cause or an indirect causality (a quick way of checking the
correlation itself is sketched below).
 Remedy: attempt to eliminate the variable you believe to be causing the
phenomenon.
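
As a quick way of checking the correlation itself (not causation), the sketch below computes a Pearson correlation coefficient between two made-up monthly series, say follower counts and revenue; a coefficient near 1 only shows that the series move together.

import numpy as np

# Hypothetical monthly figures: social media followers (thousands) and revenue (thousands).
followers = np.array([10, 12, 15, 18, 22, 25])
revenue   = np.array([110, 118, 123, 131, 142, 150])

r = np.corrcoef(followers, revenue)[0, 1]
print(f"Pearson correlation: {r:.2f}")
# A value near 1 shows co-movement; it does NOT show that follower growth
# caused the revenue growth - a common cause or indirect link is possible.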

2) Confirmation bias: our second data interpretation problem occurs when you
have a theory or hypothesis in mind, but are intent on only discovering data
patterns that provide support, while rejecting those that do not.

 Digital age example: your boss asks you to analyze the success of a recent
multi-platform social media marketing campaign. While analyzing the potential
data variables from the campaign (one that you ran and believe performed well),
you see that the share rate for Facebook posts was great, while the share rate
for Twitter tweets was not. Using only the Facebook posts to prove your
hypothesis that the campaign was successful would be a perfect manifestation
of confirmation bias.
 Remedy: as this pitfall is often based on subjective desires, one remedy would
be to analyze data with a team of objective individuals. If this is not possible,
another solution is to resist the urge to make a conclusion before data
exploration has been completed. Remember to always try to disprove a
hypothesis, not prove it.

3) Irrelevant data: the third and final data misinterpretation pitfall is especially
important in the digital age. As large data is no longer centrally stored, and as it
continues to be analyzed at the speed of thought, it is inevitable that analysts
will focus on data that is irrelevant to the problem they are trying to correct.

 Digital age example: in attempting to gauge the success of an email lead


generation campaign, you notice that the number of homepage views directly
resulting from the campaign increased, but the number of monthly newsletter
subscribers did not. Based on the number of homepage views, you decide the
campaign was a success when really it generated zero leads.
 Remedy: proactively and clearly frame any data analysis variables and KPIs
prior to engaging in a data review. If the metric you are using to measure the
success of a lead generation campaign is newsletter subscribers, there is no need
to review the number of homepage visits. Be sure to focus on the data variable
that answers your question or solves your problem and not on irrelevant data.
Interpretation of Data: The Use of Dashboards to Bridge the Gap
As we have seen, quantitative and qualitative methods are distinct types of data
analyses. Both offer a varying degree of return on investment (ROI) regarding
data investigation, testing and decision-making. Because of their differences, it
is important to understand how dashboards can be implemented to bridge the
quantitative and qualitative information gap. How are digital data dashboard
solutions playing a key role in merging the data disconnect? Here are a few of
the ways:

1) Connecting and blending data. With today’s pace of innovation, it is no


longer feasible (nor desirable) to have bulk data centrally located. As businesses
continue to globalize and borders continue to dissolve, it will become
increasingly important for businesses to possess the capability to run diverse
data analyses absent the limitations of location. Data dashboards decentralize
data without compromising on the necessary speed of thought while blending
both quantitative and qualitative data. Whether you want to measure customer
trends or organizational performance, you now have the capability to do both
without the need for a singular selection.

2) Mobile Data. Related to the notion of “connected and blended data” is that
of mobile data. In today’s digital world, employees are spending less time at
their desks and simultaneously increasing production. This is made possible by
the fact that mobile solutions for analytical tools are no longer standalone.
Today, mobile analysis applications seamlessly integrate with everyday
business tools. In turn, both quantitative and qualitative data are now available
on demand where they’re needed, when they’re needed and how they’re needed.

3) Visualization. Data dashboards are merging the data gap between qualitative
and quantitative methods of interpretation of data, through the science of
visualization. Dashboard solutions come “out of the box” well-equipped to
create easy-to-understand data demonstrations. Modern online data
visualization tools provide a variety of color and filter patterns, encourage user
interaction and are engineered to help enhance future trend predictability. All of
these visual characteristics make for an easy transition among data methods –
you only need to find the right types of data visualization to tell your data story
the best way possible.

A market research dashboard illustrates how this need to bridge quantitative and
qualitative analysis can be met and, thanks to visualization, helps in understanding
how to interpret data in research, by bringing together both qualitative and
quantitative data knowledgeably.
Chapter-V

Findings, Conclusions, Suggestions


Findings
The return on any investment, measured over a given period of time, is simply
the sum of its capital appreciation and any income generated divided by the
original amount of the investment, which is expressed as a percentage. The term
applied to this composite calculation is total return.

However, there is a difference in this simple concept as applied to stocks and


mutual funds. Unfortunately, a great many mutual fund investors do not seem to
have a clear understanding of a fund's total return. The relationships between a
fund's net asset value (NAV), yield (income) and capital gains distributions can
be confusing. For stock investors, calculating and understanding their total
return is relatively easy. By comparing how total return is derived for both
stocks and mutual funds, you'll be able to better understand how this measure
works for mutual funds.

Stock Total Return
We begin our illustration with a share of XYZ Company that is bought for $30
at the beginning of the year. During the year, its price fluctuates, but it closes
the year at $33, which represents a nice percentage return on the investment of
10% ($3/$30).

But, things get even better because XYZ paid an annual dividend of $1 per
share. This dividend equals an additional 3.3% return ($1/$30). Adding together
the capital appreciation (price increase) of 10% and the income return
(dividend) of 3.3% gives us a one-year total return for XYZ Company stock of
13.3%. However, remember that unless you sell XYZ stock, the price
appreciation gain remains in the stock price, or is unrealized.

With mutual funds, explaining total return is a bit more
complicated. We begin with a share of the ABC Fund, which is purchased at its
net asset value (price) of $16 per share. A fund's NAV is derived by dividing the
value of its portfolio securities (the fund's assets), less any accrued fees and
expenses (the fund's liabilities), by the number of fund shares outstanding.
Here's an illustration of the computation of net asset value for the ABC Fund:

The fund's stock holdings at market prices:
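
The original holdings table is not reproduced here, so the sketch below uses hypothetical holdings, liabilities and share-count figures simply to show how the NAV formula described above would be applied; the numbers are chosen so the result matches the $16 NAV used in the example.

# Hypothetical portfolio figures for the ABC Fund (in dollars).
holdings_market_value = 7_500_000   # stocks and bonds at market prices
accrued_liabilities   = 100_000     # accrued fees and expenses
shares_outstanding    = 462_500

nav = (holdings_market_value - accrued_liabilities) / shares_outstanding
print(f"Net asset value per share: ${nav:.2f}")   # $16.00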


Remember that mutual funds are priced once a day, at the end of the day.
Unlike stocks, where prices are moved by the supply and demand forces of the
marketplace, fund prices are determined by the value of the underlying
securities in the fund.

In our example, ABC is a hybrid stock/bond fund with a growth-income


orientation. Apart from capital gains, its individual portfolio holdings will
generate dividends and interest. By law, mutual funds must distribute these to
the fund's shareholders. ABC's income distribution (its dividends to
shareholders) for the year amounted to $1 per share. In addition, the fund's
trading activities (the buying and selling of securities) generated a realized
capital gain of $3 per share, which ABC also distributed to its shareholders.

The ABC Fund passed along all the earnings and capital appreciation it
generated - $4 ($1 in dividend distributions and $3 in a capital gains
distribution) to its shareholders for a total return of 25% ($4/$16). Here again,
unlike a stock, by paying out all its capital gains, the ABC Fund's price, or
NAV, remains at or close to $16. In this scenario, if a fund investor only
focused on the movement in ABC's NAV, the results would not look very good.
It's even possible for a fund's NAV to decline, but still have good
income/capital gain distributions, which will be reflected in a positive total
return.
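
Putting the two examples together, the short sketch below recomputes the XYZ stock and ABC Fund total returns from the figures given above; the formula is the same in both cases, only the income and distribution components differ.

def total_return(begin_price, end_price, income=0.0, capital_gain_dist=0.0):
    """Total return = price change plus all distributions, over the starting price."""
    return (end_price - begin_price + income + capital_gain_dist) / begin_price

# XYZ stock: bought at $30, closed the year at $33, paid a $1 dividend.
print(f"XYZ stock total return: {total_return(30, 33, income=1):.1%}")                       # 13.3%

# ABC Fund: NAV stays near $16 after paying out $1 of income and $3 of capital gains.
print(f"ABC Fund total return:  {total_return(16, 16, income=1, capital_gain_dist=3):.1%}")  # 25.0%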

Obviously, a fund's NAV does not tell the whole mutual fund performance
story, but its total return does. It captures a fund's changes in NAV, its income
distribution and capital gains distribution, which, as a whole, are the true test of
a fund's return on investment.
Conclusions
Let's recap what we've learned in this mutual fund tutorial:

 A mutual fund brings together a large group of people and invests their
aggregated money in stocks, bonds, and other securities.
 The advantages of mutual funds are professional
management, diversification, economies of scale, and wide range of
offerings.
 The disadvantages of mutual funds are high costs, over-diversification,
possible tax consequences, liquidity concerns, and the inability of
management to guarantee a superior return.
 There are many, many types of mutual funds. You can classify funds
based on asset class, investing strategy, region, etc.
 Mutual funds have expenses that can be broken down generally into
ongoing fees (represented by the expense ratio) and transaction fees
(loads).
 Some funds carry no broker fee, known as no-load mutual funds.
 One of the biggest problems with mutual funds is their costs and fees.
 Mutual funds are easy to buy and sell. You can either buy them directly
from the fund company or through a third party.
 Comparing fund returns across a number of metrics is important, such as
over time, compared to its benchmark, and compared to other funds in its
peer group.
Suggestions
1. Expense Ratio
Mutual funds do not run themselves. They need to be managed and this
management is not free! The expenses involved in operating a mutual fund can be
as extensive as those of a corporation. But all you need to know is that higher
expenses do not always translate into higher mutual fund returns. In fact,
lower expenses usually translate into higher returns, especially over long
periods of time.
But what expense ratio is too high? Which is best? When doing your
research, keep in mind the average expense ratios for mutual funds in each
category, and never buy a fund with an expense ratio higher than that
average. Notice that the average expenses change by fund category. The fundamental
reason for this is that research costs for portfolio management are higher
for certain niche areas, such as small-cap stocks and foreign stocks,
where information is not as readily available compared to large domestic
companies. Also, index funds are passively managed. Therefore costs can
be kept extremely low.

2. Manager Tenure
Manager tenure refers to the amount of time, usually measured in years,
that a mutual fund manager or management team has been managing a
particular mutual fund.
Manager tenure is most important to know when investing in actively-
managed mutual funds. Managers of actively managed funds are actively
trying to outperform a particular benchmark, such as the S&P 500;
whereas the manager of a passively-managed fund is only investing in the
same securities as the benchmark.
When looking at a mutual fund's historical performance, be sure to
confirm the manager or management team has been managing the fund
for the time frame you are reviewing. For example, if you are attracted to
the 5-year return of a mutual fund but the manager tenure is only one
year, the 5-year return is not meaningful in making the decision to buy
this fund.
BIBLIOGRAPHY

References

 Books
 Journals
 Magazines

Websites

 www.imasion.com
 www.imasionindia.com
 www.managementstudyguide.com
 www.citehr.com
