
Name : Nur Amirah Inshirah binti Ismail

Class : 5 Pendita Za’ba

I/C Number : 930224 – 08 – 5308


CONTENTS
Appreciation
Introduction
Part 1
Part 2
Part 3
Part 4
Part 5
Further Exploration
Reflection
After weeks of struggle and hard work to complete the assignment given to us
by our teacher, ………………………………………………., I finally did it within 2 weeks with
satisfaction and a sense of success, because I now understand interest and
investment more deeply than before. I am grateful and thankful to all parties
who helped me in the process of completing my assignment. It was a great
experience for me, as I learnt to be more independent and to work as a group.
For this, I would like to take this opportunity to express my thankfulness
once again to all parties concerned.

Firstly, I would like to thank my Additional Mathematics teacher,
………………………………………….., for patiently explaining to us the proper and precise
way to complete this assignment. With her help and guidance, many of the
problems I encountered were solved.

Besides that, I would like to thank my parents for the support and
encouragement they gave me. In addition, my parents also guided me on the
methods used to account for investment, which greatly enhanced my knowledge
of this particular area. Last but not least, I would like to express my
thankfulness to my cousin and friends, who patiently explained things to me
and did this project with me as a group.
Most experimental searches for paranormal phenomena are statistical in nature.
A subject repeatedly attempts a task with a known probability of success due to
chance, then the number of actual successes is compared to the chance
expectation. If a subject scores consistently higher or lower than the chance
expectation after a large number of attempts, one can calculate the probability
of such a score due purely to chance, and then argue, if the chance probability is
sufficiently small, that the results are evidence for the existence of some
mechanism (precognition, telepathy, psychokinesis, cheating, etc.) which
allowed the subject to perform better than chance would seem to permit.

Claims of evidence for the paranormal are usually based upon statistics
which diverge so far from the expectation due to chance that some other
mechanism seems necessary to explain the experimental results. To interpret
the results of our RetroPsychoKinesis experiments, we'll be using the
mathematics of probability and statistics, so it's worth spending some time
explaining how we go about quantifying the consequences of chance.
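As a rough sketch of the kind of calculation involved, the probability of a score arising purely by chance can be computed from the binomial distribution. The code below is illustrative only; the function name `chance_probability` and the example numbers are assumptions for demonstration, not values from the RetroPsychoKinesis experiments themselves.

```python
# Probability of scoring at least k successes in n trials purely by chance,
# when each trial succeeds with probability p. This is the one-sided
# binomial tail probability described in the text.
from math import comb

def chance_probability(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative example: 60 or more hits out of 100 guesses, with a 50%
# chance of success per guess. The tail probability is about 0.028 --
# small enough to look surprising, which is exactly the kind of result
# that invites a search for some non-chance mechanism.
print(chance_probability(100, 60))
```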
A) History of Probability
Probability is a way of expressing knowledge or belief that an event will occur or has
occurred. In mathematics the concept has been given an exact meaning in probability theory,
which is used extensively in such areas of study as mathematics, statistics, finance, gambling,
science, and philosophy to draw conclusions about the likelihood of potential events and the
underlying mechanics of complex systems.

The scientific study of probability is a modern development. Gambling shows that
there has been an interest in quantifying the ideas of probability for millennia, but exact
mathematical descriptions of use in those problems only arose much later.

According to Richard Jeffrey, "Before the middle of the seventeenth century, the term
'probable' (Latin probabilis) meant approvable, and was applied in that sense, univocally, to
opinion and to action. A probable action or opinion was one such as sensible people would
undertake or hold, in the circumstances."[4] However, in legal contexts especially, 'probable'
could also apply to propositions for which there was good evidence.

Aside from some elementary considerations made by Girolamo Cardano in the 16th
century, the doctrine of probabilities dates to the correspondence of Pierre de
Fermat and Blaise Pascal (1654). Christiaan Huygens (1657) gave the earliest known
scientific treatment of the subject. Jakob Bernoulli's Ars Conjectandi (posthumous, 1713)
and Abraham de Moivre's Doctrine of Chances (1718) treated the subject as a branch of
mathematics. See Ian Hacking's The Emergence of Probability and James Franklin's The
Science of Conjecture for histories of the early development of the very concept of
mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera
Miscellanea (posthumous, 1722), but a memoir prepared by Thomas Simpson in 1755
(printed 1756) first applied the theory to the discussion of errors of observation. The reprint
(1757) of this memoir lays down the axioms that positive and negative errors are equally
probable, and that there are certain assignable limits within which all errors may be supposed
to fall; continuous errors are discussed and a probability curve is given.

Pierre-Simon Laplace (1774) made the first attempt to deduce a rule for the combination of
observations from the principles of the theory of probabilities. He represented the law of
probability of errors by a curve y = φ(x), x being any error and y its probability, and laid
down three properties of this curve:

1. it is symmetric as to the y-axis;

2. the x-axis is an asymptote, the probability of the error being 0;


3. the area enclosed is 1, it being certain that an error exists.
He also gave (1781) a formula for the law of facility of error (a term due to Lagrange, 1774),
but one which led to unmanageable equations. Daniel Bernoulli (1778) introduced the
principle of the maximum product of the probabilities of a system of concurrent errors.

Probability Theory

Like other theories, the theory of probability is a representation of probabilistic
concepts in formal terms—that is, in terms that can be considered separately from their
meaning. These formal terms are manipulated by the rules of mathematics and logic, and any
results are then interpreted or translated back into the problem domain.

There have been at least two successful attempts to formalize probability, namely
the Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation
(see probability space), sets are interpreted as events and probability itself as a measure on a
class of sets. In Cox's theorem, probability is taken as a primitive (that is, not further
analyzed) and the emphasis is on constructing a consistent assignment of probability values
to propositions. In both cases, the laws of probability are the same, except for technical
details.

There are other methods for quantifying uncertainty, such as the Dempster-Shafer
theory or possibility theory, but those are essentially different and not compatible with the
laws of probability as they are usually understood.

Probability Theory Applications

Two major applications of probability theory in everyday life are in risk assessment
and in trade on commodity markets. Governments typically apply probabilistic methods
in environmental regulation, where it is called "pathway analysis", often measuring
well-being using methods that are stochastic in nature, and choosing projects to undertake
based on statistical analyses of their probable effect on the population as a whole.

A good example is the effect of the perceived probability of any widespread Middle
East conflict on oil prices - which have ripple effects in the economy as a whole. An
assessment by a commodity trader that a war is more likely vs. less likely sends prices up or
down, and signals other traders of that opinion. Accordingly, the probabilities are not
assessed independently nor necessarily very rationally. The theory of behavioral
finance emerged to describe the effect of such groupthink on pricing, on policy, and on peace
and conflict.

It can reasonably be said that the discovery of rigorous methods to assess and
combine probability assessments has had a profound effect on modern society. Accordingly,
it may be of some importance to most citizens to understand how odds and probability
assessments are made, and how they contribute to reputations and to decisions, especially in
a democracy.
Another significant application of probability theory in everyday life is reliability.
Many consumer products, such as automobiles and consumer electronics, utilize reliability
theory in the design of the product in order to reduce the probability of failure. The
probability of failure may be closely associated with the product's warranty.

A) Categories of Probability
Empirical Probability of an event is an "estimate" that the event will happen based on how
often the event occurs after collecting data or running an experiment (in a large number of
trials). It is based specifically on direct observations or experiences.

Empirical Probability Formula

P(E) = (number of times the event E occurs) / (total number of trials)

where P(E) is the probability that the event E will occur.

Theoretical Probability of an event is the number of ways that the event can occur, divided
by the total number of outcomes. It is finding the probability of events that come from a
sample space of known equally likely outcomes.

Theoretical Probability Formula

P(E) = n(E) / n(S)

where
P(E) = probability that an event, E, will occur,
n(E) = number of equally likely outcomes of E,
n(S) = number of equally likely outcomes of the sample space S.

Comparing Empirical and Theoretical Probabilities:


Empirical probability is the probability a person calculates from many different trials.
For example, someone can flip a coin 100 times and then record how many times it came up
heads and how many times it came up tails. The number of recorded heads divided by 100 is
the empirical probability of getting heads.

The theoretical probability is the result that one should get if an infinite number of
trials were done. One would expect the probability of heads to be 0.5 and the probability of
tails to be 0.5 for a fair coin.
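The coin-flip comparison above can be sketched in a few lines of Python. This is a hypothetical simulation; the seed is fixed only so that repeated runs give the same counts.

```python
import random

random.seed(1)  # fixed seed so repeated runs give the same counts

# Flip a fair coin 100 times and record the outcomes.
flips = [random.choice("HT") for _ in range(100)]

empirical = flips.count("H") / 100  # empirical probability of heads
theoretical = 0.5                   # theoretical probability for a fair coin

print(empirical, theoretical)
```

The empirical value will rarely equal 0.5 exactly after only 100 flips, but it should lie close to it; the gap shrinks as the number of flips grows.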
(a) Possible outcomes when a die is tossed once

{1,2,3,4,5,6}

(b) Possible outcomes when two dice are tossed simultaneously

Chart

DICE 2
  6 | (1,6) (2,6) (3,6) (4,6) (5,6) (6,6)
  5 | (1,5) (2,5) (3,5) (4,5) (5,5) (6,5)
  4 | (1,4) (2,4) (3,4) (4,4) (5,4) (6,4)
  3 | (1,3) (2,3) (3,3) (4,3) (5,3) (6,3)
  2 | (1,2) (2,2) (3,2) (4,2) (5,2) (6,2)
  1 | (1,1) (2,1) (3,1) (4,1) (5,1) (6,1)
    +------------------------------------
       1     2     3     4     5     6     DICE 1

Table
1 2 3 4 5 6
1 (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
2 (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
3 (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
4 (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
5 (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
6 (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

Tree diagram (not reproduced here; it lists the same 36 outcomes as the table above)
(a)

Sum of the two numbers (x) | Possible outcomes                         | Probability P(x)
 2 | (1,1)                                     | 1/36
 3 | (1,2), (2,1)                              | 2/36
 4 | (1,3), (2,2), (3,1)                       | 3/36
 5 | (1,4), (2,3), (3,2), (4,1)                | 4/36
 6 | (1,5), (2,4), (3,3), (4,2), (5,1)         | 5/36
 7 | (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)  | 6/36
 8 | (2,6), (3,5), (4,4), (5,3), (6,2)         | 5/36
 9 | (3,6), (4,5), (5,4), (6,3)                | 4/36
10 | (4,6), (5,5), (6,4)                       | 3/36
11 | (5,6), (6,5)                              | 2/36
12 | (6,6)                                     | 1/36
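The distribution above can be reproduced by enumerating the 36 equally likely outcomes. The sketch below uses Python's `fractions` module so the probabilities stay exact rather than becoming rounded decimals.

```python
from collections import Counter
from fractions import Fraction

# All 36 equally likely outcomes when two dice are tossed simultaneously.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

# Tally how many outcomes give each sum, then convert counts to probabilities.
counts = Counter(d1 + d2 for d1, d2 in outcomes)
prob = {x: Fraction(counts[x], 36) for x in sorted(counts)}

for x, p in prob.items():
    print(x, p)  # fractions are printed in lowest terms, e.g. 6/36 as 1/6
```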

(b)

Possible outcomes of the events:

A = {the two numbers are not the same}

A = {(1,2), (1,3), (1,4), (1,5), (1,6), (2,1), (2,3), (2,4), (2,5), (2,6), (3,1), (3,2),
(3,4), (3,5), (3,6), (4,1), (4,2), (4,3), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4),
(5,6), (6,1), (6,2), (6,3), (6,4), (6,5)}

B = {the product of the two numbers is greater than 36}

B = { }, since the largest possible product is 6 × 6 = 36, so no outcome qualifies.

C = {both numbers are prime or the difference between the two numbers is odd}
C = {(2,2), (2,3), (2,5), (3,2), (3,3), (3,5), (5,2), (5,3), (5,5), (1,2), (1,4), (1,6), (2,1), (3,4),
(3,6), (4,1), (4,3), (4,5), (5,4), (5,6), (6,1), (6,3), (6,5)}

D = {the sum of the two numbers is even and both numbers are prime}

D = {(2,2), (3,3), (3,5), (5,3), (5,5)}

P(A) = 30/36 = 5/6

P(B) = 0

P(C) = 23/36

P(D) = 5/36
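These four probabilities can be checked by brute-force enumeration. The sketch below encodes each event directly from its definition above.

```python
from fractions import Fraction

outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
primes = {2, 3, 5}

A = [(a, b) for a, b in outcomes if a != b]                 # numbers not the same
B = [(a, b) for a, b in outcomes if a * b > 36]             # product greater than 36
C = [(a, b) for a, b in outcomes
     if (a in primes and b in primes) or (a - b) % 2 == 1]  # both prime, or odd difference
D = [(a, b) for a, b in outcomes
     if (a + b) % 2 == 0 and a in primes and b in primes]   # even sum and both prime

for name, E in zip("ABCD", (A, B, C, D)):
    print(name, Fraction(len(E), 36))  # A 5/6, B 0, C 23/36, D 5/36
```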
(a) Using the formula

Variance = ∑fx²/∑f − x̄²

Sum of the two numbers (x) | Frequency (f) | fx  | x²  | fx²
 2 |  1 |   2 |   4 |    4
 3 |  3 |   9 |   9 |   27
 4 |  3 |  12 |  16 |   48
 5 |  7 |  35 |  25 |  175
 6 |  4 |  24 |  36 |  144
 7 |  8 |  56 |  49 |  392
 8 |  7 |  56 |  64 |  448
 9 |  8 |  72 |  81 |  648
10 |  2 |  20 | 100 |  200
11 |  5 |  55 | 121 |  605
12 |  2 |  24 | 144 |  288
   | ∑f = 50 | ∑fx = 365 |  | ∑fx² = 2979
Mean, x̄ = 365/50 = 7.3

Variance = ∑fx²/∑f − x̄² = 2979/50 − 7.3² = 6.29

Standard Deviation = √6.29 = 2.508
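The same mean, variance and standard deviation can be computed directly from the frequency table. The sketch below hard-codes the 50-toss frequencies from the table above.

```python
from math import sqrt

# Frequencies of each sum (x) observed in the 50-toss experiment above.
freq = {2: 1, 3: 3, 4: 3, 5: 7, 6: 4, 7: 8, 8: 7, 9: 8, 10: 2, 11: 5, 12: 2}

n = sum(freq.values())                                            # ∑f = 50
mean = sum(x * f for x, f in freq.items()) / n                    # ∑fx / ∑f
variance = sum(x * x * f for x, f in freq.items()) / n - mean**2  # ∑fx²/∑f − x̄²
std_dev = sqrt(variance)

print(mean, round(variance, 2), round(std_dev, 3))  # 7.3 6.29 2.508
```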

(b) Prediction of the mean value if the number of tosses is increased to 100 times
= 6.9

(c) Using the formula

Variance = ∑fx²/∑f − x̄²

Sum of the two numbers (x) | Frequency (f) | fx   | x²  | fx²
 2 |   2 |   4 |   4 |    8
 3 |   6 |  18 |   9 |   54
 4 |   8 |  32 |  16 |  128
 5 |  13 |  65 |  25 |  325
 6 |  13 |  78 |  36 |  468
 7 |  16 | 112 |  49 |  784
 8 |  14 | 112 |  64 |  896
 9 |  12 | 108 |  81 |  972
10 |  10 | 100 | 100 | 1000
11 |   4 |  44 | 121 |  484
12 |   2 |  24 | 144 |  288
   | ∑f = 100 | ∑fx = 697 |  | ∑fx² = 5407

Mean, x̄ = 697/100 = 6.97

Variance = ∑fx²/∑f − x̄² = 5407/100 − 6.97² = 5.489

Standard Deviation = √5.489 = 2.343


(a)

Sum of the two numbers (x) | P(x) | x·P(x) | x²  | x²·P(x)
 2 | 1/36 |  2/36 |   4 |   4/36
 3 | 2/36 |  6/36 |   9 |  18/36
 4 | 3/36 | 12/36 |  16 |  48/36
 5 | 4/36 | 20/36 |  25 | 100/36
 6 | 5/36 | 30/36 |  36 | 180/36
 7 | 6/36 | 42/36 |  49 | 294/36
 8 | 5/36 | 40/36 |  64 | 320/36
 9 | 4/36 | 36/36 |  81 | 324/36
10 | 3/36 | 30/36 | 100 | 300/36
11 | 2/36 | 22/36 | 121 | 242/36
12 | 1/36 | 12/36 | 144 | 144/36
   | ∑x·P(x) = 252/36 |  | ∑x²·P(x) = 1974/36

Mean, x̄ = 252/36 = 7

Variance = ∑x²·P(x) − x̄² = 1974/36 − 7² = 54.833 − 49 = 5.833

Standard Deviation = √5.833 = 2.415
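The theoretical mean and variance can also be computed exactly. The sketch below builds the distribution P(x) from the triangular pattern of the table (the number of outcomes giving sum x is 6 − |x − 7|) and works in exact fractions.

```python
from fractions import Fraction

# Theoretical distribution of the sum of two fair dice:
# the number of outcomes giving sum x is 6 - |x - 7|, out of 36.
P = {x: Fraction(6 - abs(x - 7), 36) for x in range(2, 13)}

mean = sum(x * p for x, p in P.items())                    # E[X] = sum of x*P(x)
variance = sum(x * x * p for x, p in P.items()) - mean**2  # E[X^2] - E[X]^2

print(mean, variance)  # 7 and 35/6 (about 5.833)
```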


(b) Comparison of mean, variance and standard deviation from Part 4 and Part 5

                   | Part 4 (a) | Part 4 (c) | Part 5
Mean               | 7.3        | 6.97       | 7
Variance           | 6.29       | 5.489      | 5.833
Standard Deviation | 2.508      | 2.343      | 2.415

We can see that the mean, variance and standard deviation obtained through
experiment in Part 4 differ from, but are close to, the theoretical values in Part 5.

For the mean, when the number of trials increased from n = 50 to n = 100, its value got
closer to the theoretical value of 7 (from 7.3 to 6.97). This is in accordance with the
Law of Large Numbers, which we discuss in the next section.

The empirical variance and standard deviation from Part 4 also moved toward the
theoretical values of Part 5, although at n = 100 they overshot and fell below them.
This is not a violation of the Law of Large Numbers; it reflects that

a. the sample (n = 100) is still not large enough for the empirical mean, variance
   and standard deviation to settle at the theoretical values, and
b. the Law of Large Numbers describes a long-run tendency rather than a guarantee
   for any finite number of trials, so fluctuations around the theoretical values
   are expected.

In conclusion, the empirical mean, variance and standard deviation can differ from the
theoretical values. As the number of trials (the sample size) gets bigger, the empirical
values should get closer to the theoretical values. However, deviations from this rule are
still possible, especially when the number of trials is not large enough.

(a) Range of the Mean

n Mean
50 7.3
100 6.97

As n changes, the mean of the sum of the two numbers stays in the range between 6 and 8.
The probability distribution graph likewise shows that the mean lies between 6 and 8.

In probability theory, the law of large numbers (LLN) is a theorem that describes the
result of performing the same experiment a large number of times. According to the law,
the average of the results obtained from a large number of trials should be close to
the expected value, and will tend to become closer as more trials are performed.

For example, a single roll of a fair six-sided die produces one of the numbers 1, 2, 3, 4, 5
or 6, each with equal probability. Therefore, the expected value of a single die roll is

(1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
According to the law of large numbers, if a large number of dice are rolled, the
average of their values (sometimes called the sample mean) is likely to be close to 3.5, with
the accuracy increasing as more dice are rolled.
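A quick simulation illustrates this convergence. This is a hypothetical run with a fixed seed for repeatability; the exact sample means will differ with another seed, but the drift toward 3.5 will not.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Sample mean of n die rolls for increasing n. By the law of large numbers
# the sample mean should settle near the expected value 3.5 as n grows.
for n in (10, 100, 10000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)
```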
Similarly, when a fair coin is flipped once, the expected value of the number of heads
is equal to one half. Therefore, according to the law of large numbers, the proportion of heads
in a large number of coin flips should be roughly one half. In particular, the proportion of
heads after n flips will almost surely converge to one half as n approaches infinity.

Though the proportion of heads (and tails) approaches half, almost surely the absolute
(nominal) difference in the number of heads and tails will become large as the number of
flips becomes large. That is, the probability that the absolute difference is a small number
approaches zero as number of flips becomes large. Also, almost surely the ratio of the
absolute difference to number of flips will approach zero. Intuitively, expected absolute
difference grows, but at a slower rate than the number of flips, as the number of flips grows.

The LLN is important because it "guarantees" stable long-term results for random
events. For example, while a casino may lose money in a single spin of the roulette wheel, its
earnings will tend towards a predictable percentage over a large number of spins. Any
winning streak by a player will eventually be overcome by the parameters of the game. It is
important to remember that the LLN only applies (as the name indicates) when a large
number of observations are considered. There is no principle that a small number of
observations will converge to the expected value or that a streak of one value will
immediately be "balanced" by the others.
While conducting this project, I learned many moral values that I can
practise. This project work taught me to be more confident when doing something,
especially the homework given by the teacher. I also learned to be a disciplined
student who is always punctual, completes the work independently and researches
information from the internet. I also enjoyed making this project during the school
holidays, as I spent time with friends to complete it, and it has tightened our
friendship.
