
PB0003 Statistics for Management

(3 credits)
Assignment 1

1. Briefly explain the functions and limitations of Statistics in your own
words.

Ans. Important functions of Statistics are:

1. It simplifies complexity of the data: Complex numerical data are simplified by the
application of statistical methods. For instance, complex data regarding varying
costs and prices of commodities of daily use can be reduced to the form of cost
of living index number. This can be understood easily.

2. It reduces the bulk of the data: Voluminous data could be reduced to a few
figures making them easily understandable.

3. It adds precision to thinking: Statistics sharpens one’s thinking.

4. It helps in comparing different sets of figures: The imports and exports of a
country may be compared among themselves or they may be compared with
those of another country.

5. It guides in the formulation of policies and helps in planning: Planning and policy
making by the government are based on statistics of production, demand, etc.

6. It indicates trends and tendencies: Knowledge of trends and tendencies helps in
future planning.

7. It helps in studying relationships between different factors: Statistical methods may
be used for studying the relation between production and prices of commodities.

The Limitations of Statistics

Major limitations of Statistics are:

1. Statistics does not deal with qualitative data. It deals only with quantitative data:
Statistical methods can be applied only to numerically expressed data.
Qualitative characteristics can be studied only if an alternative method of
numerical measurement is introduced.

2. Statistics does not deal with individual facts: Statistical methods can be applied
only to aggregates of facts. A single fact cannot be statistically studied.

3. Statistics can be misused: Increasing misuse of Statistics has led to increasing
distrust in Statistics.

4. Statistical inferences are not exact: Statistical inferences are true only on an
average. They are probabilistic statements.

5. Common people cannot handle Statistics properly: Only statisticians can handle
statistics properly. An illogical analysis of statistical data leads to statistical fallacies.

2. A survey of 128 smokers revealed the following frequency distribution of
daily expenditure on smoking of these smokers. Find the mean daily
expenditure.

Expenditure (Rs.)   10-20  20-30  30-40  40-50  50-60  60-70  70-80
No. of smokers        23     44     35     12      9      3      2

Solution:

Expenditure (Rs.)  Frequency (f)  Mid-value (x)     fx
10-20                   23             15           345
20-30                   44             25          1100
30-40                   35             35          1225
40-50                   12             45           540
50-60                    9             55           495
60-70                    3             65           195
70-80                    2             75           150
Total                  128              -          4050

The mean is

X̄ = ∑fx / N = 4050 / 128 = Rs. 31.64

The mean daily expenditure is Rs. 31.64.
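The computation above can be sketched in Python; the class intervals and frequencies are taken from the table, while the variable names are purely illustrative:

```python
# Mean of a grouped frequency distribution: X-bar = sum(f * x) / N,
# where x is the mid-value of each class interval.
intervals = [(10, 20), (20, 30), (30, 40), (40, 50), (50, 60), (60, 70), (70, 80)]
freqs = [23, 44, 35, 12, 9, 3, 2]

mids = [(lo + hi) / 2 for lo, hi in intervals]        # 15, 25, ..., 75
n = sum(freqs)                                        # N = 128
mean = sum(f * x for f, x in zip(freqs, mids)) / n    # 4050 / 128
print(round(mean, 2))  # → 31.64
```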

3. What do you mean by Marginal Probabilities under statistical dependence?

Ans. A marginal or unconditional probability is the simple probability of the
occurrence of an event. In a fair coin toss, P(H) = 0.5 and P(T) = 0.5; that is, the
probability of heads equals 0.5 and the probability of tails equals 0.5. This is true
for every toss, no matter how many tosses have been made or what their
outcomes have been. Every toss stands alone and is in no way connected with
any other toss. Thus, the outcome of each toss of a fair coin is an event that is
statistically independent of the outcomes of every other toss of the coin.

Imagine that we have a biased or unfair coin that has been altered in such a way
that heads occurs 0.90 of the time and tails 0.10 of the time. On each individual
toss, P(H)=0.90, and P(T)=0.10. The outcome of any particular toss is completely
unrelated to the outcomes of the tosses that may precede or follow it. The
outcomes of several tosses of this coin are statistically independent events too,
even though the coin is biased.
Marginal probabilities under statistical dependence are computed by summing up
the probabilities of all the joint events in which the simple event occurs. Consider,
for example, an experiment in which each outcome is either colored (C) or gray (G),
and either dotted (D) or striped (S), with joint probabilities P(CD) = 0.3,
P(CS) = 0.1, P(GD) = 0.2 and P(GS) = 0.4. We can compute the marginal
probability of the event colored by summing the probabilities of the two joint
events in which colored occurred:

P(C) = P(CD) + P(CS) = 0.3 + 0.1 = 0.4

Similarly, the marginal probability of the event gray can be computed by summing
the probabilities of the two joint events in which gray occurred:

P(G) = P(GD) + P(GS) = 0.2 + 0.4 = 0.6

In like manner, we can compute the marginal probability of the event dotted by
summing the probabilities of the two joint events in which dotted occurred:

P(D) = P(CD) + P(GD) = 0.3 + 0.2 = 0.5

And, finally, the marginal probability of the event striped can be computed by
summing the probabilities of the two joint events in which striped occurred:

P(S) = P(CS) + P(GS) = 0.1 + 0.4 = 0.5
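The same marginal sums can be sketched in Python; the joint probabilities are those of the example, and the function name is illustrative:

```python
# Marginal probability under statistical dependence: sum the probabilities
# of all the joint events in which the simple event occurs.
joint = {("C", "D"): 0.3, ("C", "S"): 0.1, ("G", "D"): 0.2, ("G", "S"): 0.4}

def marginal(event):
    # P(event) = sum of P(joint event) over joint events containing it.
    return sum(p for pair, p in joint.items() if event in pair)

for e in ("C", "G", "D", "S"):
    print(e, round(marginal(e), 1))   # C 0.4, G 0.6, D 0.5, S 0.5
```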

4. The probabilities of three events A, B and C occurring are P(A) = 0.35,
P(B) = 0.45 and P(C) = 0.2. Assuming that A, B or C has occurred, the
probabilities of another event X occurring are P(X/A) = 0.8, P(X/B) = 0.65 and
P(X/C) = 0.3. Find P(A/X), P(B/X) and P(C/X).

Ans.

Event  P(Event)  P(X/Event)  P(X and Event)        P(Event/X) = P(Event and X)/P(X)
A        0.35       0.80     0.35*0.80 = 0.2800    0.2800/0.6325 = 0.4427
B        0.45       0.65     0.45*0.65 = 0.2925    0.2925/0.6325 = 0.4625
C        0.20       0.30     0.20*0.30 = 0.0600    0.0600/0.6325 = 0.0949

P(X) = P(XA) + P(XB) + P(XC) = 0.2800 + 0.2925 + 0.0600 = 0.6325

Therefore P(A/X) = 0.4427, P(B/X) = 0.4625 and P(C/X) = 0.0949.
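The table's computation is Bayes' theorem, and it can be sketched in Python using the priors and conditional probabilities given in the question (the dictionary names are illustrative):

```python
# Bayes' theorem: P(Event/X) = P(X/Event) * P(Event) / P(X),
# where P(X) = sum of P(X/Event) * P(Event) over all events.
prior = {"A": 0.35, "B": 0.45, "C": 0.2}
likelihood = {"A": 0.8, "B": 0.65, "C": 0.3}          # P(X/Event)

joint = {e: prior[e] * likelihood[e] for e in prior}  # P(X and Event)
p_x = sum(joint.values())                             # P(X) = 0.6325

posterior = {e: joint[e] / p_x for e in joint}        # P(Event/X)
for e in ("A", "B", "C"):
    print(e, round(posterior[e], 4))   # A 0.4427, B 0.4625, C 0.0949
```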

5. Write short notes on Bernoulli distribution.

Ans.
One widely used probability distribution of a discrete random variable is the
binomial distribution. It describes a variety of processes of interest to managers.
The binomial distribution describes discrete, not continuous, data, resulting from
an experiment known as a Bernoulli process, after the seventeenth-century Swiss
mathematician Jacob Bernoulli. The tossing of a fair coin a fixed number of times
is a Bernoulli process, and the outcomes of such tosses can be represented by
the binomial probability distribution. The success or failure of interviewees on an
aptitude test may also be described by a Bernoulli process. On the other hand,
the frequency distribution of the lives of fluorescent lights in a factory would be
measured on a continuous scale of hours and would not qualify as a binomial
distribution.

Use of the Bernoulli Process

We can use the outcomes of a fixed number of tosses of a fair coin as an
example of a Bernoulli process. We can describe this process as follows:

1. Each trial (each toss, in this case) has only two possible outcomes: heads or
tails, yes or no, success or failure.

2. The probability of the outcome of any trial (toss) remains fixed over time. With a
fair coin, the probability of heads remains 0.5 for each toss regardless of the
number of times the coin is tossed.

3. The trials are statistically independent; that is, the outcome of one toss does
not affect the outcome of any other toss.

The probability of r successes in n trials is given as:

P(r) = nCr p^r q^(n-r) = [n! / (r! (n-r)!)] p^r q^(n-r)

where p is the probability of success on any one trial and q = 1 - p.

The mean of a binomial distribution is given as µ = np, and
the standard deviation of a binomial distribution as σ = √(npq).
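The formula above can be sketched in Python; the example numbers (3 heads in 5 fair tosses) are illustrative assumptions, not from the text:

```python
from math import comb, sqrt

# Binomial probability of r successes in n independent Bernoulli trials,
# each succeeding with probability p: P(r) = nCr * p^r * q^(n-r).
def binomial_pmf(r, n, p):
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

print(binomial_pmf(3, 5, 0.5))        # 3 heads in 5 fair tosses → 0.3125

n, p = 5, 0.5
mean = n * p                          # µ = np = 2.5
sd = sqrt(n * p * (1 - p))            # σ = √(npq) ≈ 1.118
```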

Conditions for Using the Bernoulli Process

One of the requirements for using a Bernoulli process is that the probability of
the outcome must be fixed over time. This is a very difficult condition to meet in
practice. Even a fully automatic machine making parts will experience some wear
as the number of parts increases, and this will affect the probability of producing
acceptable parts. Still another condition for its use is that the trials (manufacture of
parts in our machine example) be independent. This too is a condition that is
hard to meet. If our machine produces a long series of bad parts, this could affect
the position (or sharpness) of the metal-cutting tool in the machine. Here, as in
every other situation, going from the textbook to the real world is often difficult,
and smart managers use their experience and intuition to know when a
Bernoulli process is appropriate.

6. What do you mean by Stratified Sampling?


Ans.
In stratified sampling, we divide the population into relatively homogeneous
groups, called strata. Then we use one of two approaches. Either we select at
random from each stratum a specified number of elements corresponding to the
proportion of that stratum in the population as a whole, or we draw an equal
number of elements from each stratum and give weight to the results accordingly.
Either way, stratified sampling guarantees that every element in the population
has a chance of being selected.

Uses of Stratified Sampling

Stratified sampling is appropriate when the population is already divided into
groups of different sizes and we wish to acknowledge this fact. Suppose that a
doctor's patients are divided into four groups, say, according to age. The doctor
wants to find out how many hours his patients sleep. To obtain an estimate of this
characteristic of the population, he could take a random sample from each of the
four age groups and give weight to the samples according to the percentage of
patients in that group. This would be an example of a stratified sample.
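The doctor example can be sketched in Python; every number below (the group shares and the sampled sleep hours) is hypothetical, purely for illustration of the weighting step:

```python
# Stratified estimate: weight each stratum's sample mean by that
# stratum's share of the population, then sum.
strata = {
    # age group: (hypothetical share of patients, hypothetical sampled hours)
    "under 18": (0.20, [9.1, 8.7, 9.4]),
    "18-40":    (0.35, [7.2, 6.9, 7.5]),
    "41-65":    (0.30, [6.8, 7.0, 6.5]),
    "over 65":  (0.15, [6.1, 5.9, 6.4]),
}

estimate = sum(
    share * (sum(hours) / len(hours))   # weight * stratum sample mean
    for share, hours in strata.values()
)
print(round(estimate, 2))  # weighted mean sleep hours
```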

The advantage of stratified samples is that when they are properly designed,
they more accurately reflect characteristics of the population from which they
were chosen than do other kinds of samples.

